Journal tags: change

Increment by increment

The bedrock of the World Wide Web is solid. Built atop the protocols of the internet (TCP/IP), its fundamental building blocks remain: URLs of HTML files transmitted over HTTP. Baldur Bjarnason writes:

Even today, the web is like a living fossil, a preserved relic from a different era. Anybody can put up a website. Anybody can run a business over it. I can build an app or service, send the URL to anybody I like, and most people in the world will be able to run it without asking anybody’s permission.

Still, the web has evolved. In fact, that evolution is something that’s also built into its fundamental design. Rather than try to optimise the World Wide Web for one particular use-case, Tim Berners-Lee realised the power of being flexible. Like the internet, the World Wide Web is deliberately dumb.

(I get very annoyed when people talk about the web as being designed for scientific work at CERN. That was merely the first use-case. The web was designed for everything …and nothing in particular.)

Robin Berjon compares the web’s evolution to the ship of Theseus:

That’s why it’s been so hard to agree about what the Web is: the Web is architected for resilience which means that it adapts and transforms. That flexibility is the reason why I’m talking about some mythological dude’s boat. Altogether too often, we consider some aspects of the Web as being invariants when they’re potentially just as replaceable as any other part. This isn’t to say that there are no invariants on the Web.

The web can be changed. That’s both a comfort and a warning. There’s plenty that we should change about today’s web. But there’s also plenty—at the root level—that we should fight to preserve.

And if you want change, the worst way to go about it is to promulgate the notion of burning everything down and starting from scratch. As Erin says in the fourth and final part of her devastating series on Meta in Myanmar:

We don’t get a do-over planet. We won’t get a do-over network.

Instead, we have to work with the internet we made and find a way to rebuild and fortify it to support the much larger projects of repair—political, cultural, environmental—that are required for our survival.

Though, as Robin points out, that doesn’t preclude us from sharing a vision:

Proceeding via small, incremental changes can be a laudable approach, but even then it helps to have a sense for what it is that those small steps are supposed to be incrementing towards.

I’m looking forward to reading what Robin puts forward, particularly because he says “I’m no technosolutionist.”

From a technical perspective, the web has never been better. We have incredible features in HTML, CSS, and JavaScript, all standardised and with amazing interoperability between browsers. The challenges that face the web today are not technical.

That’s one of the reasons why I have no patience for the web3 crowd. Apart from the ridiculous name, they’re focusing on exactly the wrong part of the stack.

Listen to their pitch and they’ll point out that while, yes, the fundamental bedrock of the web is indeed decentralised—TCP/IP, HTTP(S)—what’s been constructed on that foundation is increasingly centralised: the power brokers of Google, Meta, and Amazon.

And what’s the solution they propose? Replace the underlying infrastructure with something-something-blockchain.

Would that it were so simple.

The problems of today’s web are not technical in nature. The problems of today’s web won’t be solved by technology. If we’re going to solve the problems of today’s web, we’ll need to do it through law, culture, societal norms, and co-operation.

(Feel free to substitute “today’s web” with “tomorrow’s climate”.)

Innovation

I did an episode of the Clearleft podcast on innovation a while back:

Everyone wants to be innovative …but no one wants to take risks.

The word innovation is often bandied about in an unquestioned positive way. But if we acknowledge that innovation is—by definition—risky, then the exhortations sound less positive.

“We provide innovative solutions for businesses!” becomes “We provide risky solutions for businesses!”

I was reminded of this when I saw the website for the Podcast Standards Project. The original text on the website described the project as:

…a grassroots coalition working to establish modern, open standards, to enable innovation in the podcast industry.

I pushed back on that wording (partly because I’ve seen the word “innovation” used as a smoke screen for user-hostile practices like tracking and surveillance). The wording has since changed to:

…a grassroots coalition dedicated to creating standards and practices that improve the open podcasting ecosystem for both listeners and creators.

That’s better. It’s more precise.

Am I nitpicking? Only if you think that “innovation” and “improvement” are synonyms. I don’t think they are.

Innovation implies change. Improvement implies positive change.

Not all change is positive. Not all innovation is positive.

Innovation goes hand in hand with disruption. Again, disruption involves change. But not necessarily positive change.

Think about the antonyms of change and disruption: stasis and stability. Those words don’t sound very exciting, but in some arenas they’re exactly what you should be aiming for; arenas like infrastructure or standards.

Not to get all pace layers-y here, but it seems to me that every endeavour has a sweet spot for innovation. For some projects, too little innovation is bad. For others, too much innovation is worse.

The trick is knowing which kind of project you’re working on.

(As a side note, I think some people use the word innovation to describe the generative, divergent phase of a design project: “how might we come up with innovative new approaches?” But we already have a word to describe the practice of generating novel and interesting ideas. That word isn’t innovation. It’s creativity.)

Spring

Spring is arriving. It’s just taking its time.

There are little signs. Buds on the trees. The first asparagus of the year. Daffodils. Changing the clocks. A stretch in the evenings. But the weather remains, for the most part, chilly and grim.

Reality is refusing to behave like a fast-forward montage leading up to a single day when you throw open the curtains and springtime is suddenly there in all its glory.

That’s okay. I can wait. I’ve had a lot of practice over the past three years. We all have. Staying home, biding time, saving lives.

But hunkering down during The Situation isn’t like taking shelter during an air raid. There isn’t a signal that sounds to indicate “all clear!” It’s more like going from Winter to Spring. It’s slow, almost imperceptible. But it is happening.

I’ve noticed a subtle change in my risk assessment over the past few months. I still think about COVID-19. I still factor it into my calculations. But it’s no longer the first thing I think of.

That’s a subtle change. It doesn’t seem like that long ago when COVID was at the forefront of my mind, especially if I was weighing up an excursion. Is it worth going to that restaurant? How badly do I want to go to that gig? Should I go to that conference?

Now I find myself thinking of COVID as less of a factor in my decision-making. It’s still there, but it has slowly slipped down the ranking.

I know that other people feel differently. For some people, COVID slipped out of their minds long ago. For others, it’s still very much front and centre. There isn’t a consensus on how to evaluate the risks. Like I said:

It’s like when you’re driving and you think that everyone going faster than you is a maniac, and everyone going slower than you is an idiot.

COVID-19 isn’t going away. But perhaps The Situation is.

The Situation has been gradually fading away. There isn’t a single moment where, from one day to the next, we can say “this marks the point where The Situation ended.” Even if there were, it would be a different moment for everyone.

As of today, the COVID-19 app officially stops working. Perhaps today is as good a day as any to say Spring has arrived. The season of rebirth.

Brandolini’s blockchain

I’ve already written about how much I enjoyed hosting Leading Design San Francisco last week.

All the speakers were terrific. Lola’s talk was particularly …um, interesting:

In this talk, Lola will share her adventures in the world of blockchain, the hostility she experienced in her first go-round in 2018, and why she’s chosen to head back to a technology that is going through its largest reputational and social crisis to date.

Wait …I was supposed to stand on stage and introduce a talk that was (at least partly) about blockchain? I have opinions.

As it turned out, Lola warned me that I’d be making an appearance in her talk. She was going to quote that blog post. Before the talk, I asked her how obnoxious I could be about blockchain in her intro. She told me to bring it.

So in the introduction, I deployed all the sarcasm I had in me and said:

Listen, we designers have a tendency to be over-critical of things sometimes. There are all these ideas that we dismiss: phrenology, homeopathy, flat-earthism …blockchain. Haters gonna hate.

I remember somebody asking online a while back, “Why the hate for web3?” And someone I know responded by saying “We hate it because we understand it.” I think there’s a lot of truth to that.

But look, just because blockchains are powering crypto ponzi schemes and N F fucking Ts, it’s worth remembering that it’s also simply a technology. It’s a technological solution in search of a problem.

To be fair, it’s still early days. After all, it’s only been over a decade now.

It’s like the law of the instrument says: when all you have is a hammer, everything looks like a nail. Blockchain is like that. Except the hammer is also made of glass.

Anyway, Lola is going to defend the indefensible and talk about blockchain. One thing to keep in mind is this: remember when everyone was talking about “The Cloud”? And then it turned out that you could substitute the phrase “someone else’s server” for “The Cloud?” Well, every time you hear Lola say the word “blockchain”, I’d like you to mentally substitute the phrase “multiple copies of a spreadsheet.”

Please give an open mind and a warm welcome to Lola Oyelayo Pearson!

I got some laughs. I also got lots of gasps and pearl-clutching, as though I were saying something taboo. Welcome to San Francisco.

Lola gave as good as she got. I got a roasting in her talk.

And just to clarify, Lola and I are friends—this was a consensual smackdown.

There was a very serious point to Lola’s talk. Cryptobollocks and other blockchain-powered schemes have historically been very bro-y, and exploitative of non-bro communities. Lola wants to fight that trend.

I get it. But it reminds me a bit of the justifications you hear from people who go to work at Facebook claiming that they can do more good from the inside. Whatever helps you sleep at night.

The crux of Lola’s belief is this: blockchain technology is inevitable, therefore it is incumbent on us as ethical designers to ensure that the technology is deployed in a way that empowers people instead of exploiting them.

But I take issue with the premise. Blockchain technology is not inevitable. That’s the worst kind of technological determinism. It’s defeatist. It’s a depressing view of “progress” driven not by people, but by technological forces beyond our control.

I refuse to accept that anti-humanist deterministic view.

In any case, for technological determinism to have any validity, there needs to be something to it. At least virtual reality and machine learning are based on some actual technologies. In the case of cryptobollocks, there is no there there. There is nothing except the hype, which is why you’ll see blockchain enthusiasts trying to ride the coattails of trending technologies in a logical fallacy that goes something like this:

  1. There are technologies that will be really big in the future,
  2. blockchain is a technology, therefore
  3. blockchain will be really big in the future.

Blockchain is bullshit. It isn’t even very clever bullshit. And it certainly isn’t inevitable.

Change

I’ve spent the last few days in San Francisco where I was hosting Leading Design.

It was excellent. Rebecca did an absolutely amazing job with the curation, and the Clearleft team delivered a terrific event, as always. I’m continually amazed by the way such a relatively small agency can punch above its weight when it comes to putting on world-class events and delivering client work.

I won’t go into much detail on what was shared at Leading Design. There’s an understanding that it’s a safe space for people to speak freely and share their experiences in an open and honest way. I can tell you that there were some tough topics. Given the recent rounds of layoffs in this neck of the woods, this was bound to happen.

I was chatting with Peter at breakfast on the second day and he was saying that maybe there was too much emphasis on the negative, like we were in danger of wallowing in our own misery. It’s a fair point, but I offered a counterpoint that I also heard other people express: when else do these people get a chance to let their guard down and have a good ol’ moan? These are design leaders who need to project an air of calm reassurance when they’re at work. Leading Design is a welcome opportunity to just let it all out.

When we did Leading Design in New York in March of 2022, it was an intimate gathering and the overwhelming theme was togetherness. After two years of screen-based interactions, it was cathartic to get together in the same location to swap stories and be reminded you are not alone.

Leading Design San Francisco was equally cathartic, but the theme this time was change. Change can be scary. But it can also be energising.

After two days of introducing and listening to fascinating talks on the topic of change, I closed out my duties by quoting the late great Octavia Butler. I spoke the mantra of the secular Earthseed religion founded in Parable Of The Sower:

All that you touch
You Change.

All that you Change
Changes you.

The only lasting truth
Is Change.

God
Is Change.

Pace layers and design principles

I think it was Jason who once told me that if you want to make someone’s life a misery, teach them about typography. After that they’ll be doomed to notice all the terrible type choices and kerning out there in the world. They won’t be able to unsee it. It’s like trying to unsee the arrow in the FedEx logo.

I think that Stewart Brand’s pace layers model is a similar kind of mind virus, albeit milder. Once you’ve been exposed to it, you start seeing it in all kinds of systems.

Each layer is functionally different from the others and operates somewhat independently, but each layer influences and responds to the layers closest to it in a way that makes the whole system resilient.

Last month I sent out an edition of the Clearleft newsletter that was all about pace layers. I gathered together examples of people who have been infected with the pace-layer mindworm who were applying the same layered thinking to other areas:

My own little mash-up is applying pace layers to the World Wide Web. Tom even brought it to life as an animation.

See the Pen Web Layers Of Pace by Tom (@webrocker) on CodePen.

Recently I had another flare-up of the pace-layer pattern-matching infection.

I was talking to some visiting Austrian students on the weekend about design principles. I explained my mild obsession with design principles stemming from the fact that they sit between “purpose” (or values) and “patterns” (the actual outputs):

Purpose » Principles » Patterns

Your purpose is “why?”

That then influences your principles, “how?”

Those principles inform your patterns, “what?”

Hey, wait a minute! If you put that list in reverse order it looks an awful lot like the pace-layers model with the slowest moving layer at the bottom and the fastest moving layer at the top. Perhaps there’s even room for an additional layer when patterns go into production:

  • Production
  • Patterns
  • Principles
  • Purpose

Your purpose should rarely—if ever—change. Your principles can change, but not too frequently. Your patterns need to change quite often. And what you’re actually putting out into production should be constantly updated.

As you travel from the most abstract layer—“purpose”—to the most concrete layer—“production”—the pace of change increases.

I can’t tell if I’m onto something here or if I’m just being apophenic. Again.

Upgrade paths

After I jotted down some quick thoughts last week on the disastrous way that Google Chrome rolled out a breaking change, others have posted more measured and incisive takes.

In fairness to Google, the Chrome team is receiving the brunt of the criticism because they were the first movers. Mozilla and Apple are on board with making the same breaking change, but Google is taking the lead on this.

As I said in my piece, my issue was less to do with whether confirm(), prompt(), and alert() should be deprecated but more to do with how it was done, and the woeful lack of communication.

Thinking about it some more, I realised that what bothered me was the lack of an upgrade path. Considering that dialog is nowhere near ready for use, it seems awfully cart-before-horse-putting to first remove a feature and then figure out a replacement.
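
To make “upgrade path” concrete, here’s a rough sketch of the kind of migration story that could have been offered once dialog is actually usable: a dialog-based stand-in for a blocking confirm(). The askToConfirm name and the markup are purely illustrative—mine, not anything proposed by the Chrome team.

  // Hypothetical sketch: replacing a blocking confirm() with the dialog element.
  // Unlike confirm(), this is asynchronous, so callers get a promise back.
  function askToConfirm(message) {
    return new Promise(function (resolve) {
      const dialog = document.createElement('dialog');
      dialog.innerHTML =
        '<form method="dialog">' +
        '<p></p>' +
        '<button value="cancel">Cancel</button>' +
        '<button value="ok">OK</button>' +
        '</form>';
      dialog.querySelector('p').textContent = message;
      dialog.addEventListener('close', function () {
        // Buttons inside a method="dialog" form close the dialog and set returnValue.
        resolve(dialog.returnValue === 'ok');
        dialog.remove();
      });
      document.body.appendChild(dialog);
      dialog.showModal();
    });
  }

  // Where you once wrote: if (confirm('Delete this?')) { … }
  // you would now write: askToConfirm('Delete this?').then(function (ok) { … });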

I was chatting to Amber recently and realised that there was a very different example of a feature being deprecated in web browsers…

We were talking about the KeyboardEvent.keyCode property. Did you get the memo that it’s deprecated?

But fear not! You can use the KeyboardEvent.code property instead. It’s much nicer to use too. You don’t need to look up a table of numbers to figure out how to refer to a specific key on the keyboard—you use its actual value instead.
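
As a rough illustration (the save() call is a hypothetical placeholder), here’s the before and after for handling, say, the S key:

  // The old way — deprecated numeric keyCode values you have to look up:
  document.addEventListener('keydown', function (event) {
    if (event.keyCode === 83) { // 83 happens to mean “S”
      save();
    }
  });

  // The new way — the code property uses a readable name for the physical key:
  document.addEventListener('keydown', function (event) {
    if (event.code === 'KeyS') {
      save();
    }
  });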

So the way that change was communicated was:

Hey, you really shouldn’t use the keycode property. Here’s a better alternative.

But with the more recent change, the communication was more like:

Hey, you really shouldn’t use confirm(), prompt(), or alert(). So go fuck yourself.

Foundations

There was quite a kerfuffle recently about a feature being removed from Google Chrome. To be honest, the details don’t really matter for the point I want to make, but for the record, this was about removing alert and confirm dialogs from cross-origin iframes (and eventually everywhere else too).

It’s always tricky to remove a long-established feature from web browsers, but in this case there were significant security and performance reasons. The problem was how the change was communicated. It kind of wasn’t. So the first that people found out about it was when things suddenly stopped working (like CodePen embeds).

The Chrome team responded quickly and the change has now been pushed back to next year. Hopefully there will be significant communication before that to let site owners know about the upcoming breakage.

So all’s well that ends well and we’ve all learned a valuable lesson about the importance of communication.

Or have we?

While this was going on, Emily Stark tweeted a more general point about breakage on the web:

Breaking changes happen often on the web, and as a developer it’s good practice to test against early release channels of major browsers to learn about any compatibility issues upfront.

Yikes! To me, this appears wrong on almost every level.

First of all, breaking changes don’t happen often on the web. They are—and should be—rare. If that were to change, the web would suffer massively in terms of predictability.

Secondly, the onus is not on web developers to keep track of older features in danger of being deprecated. That’s on the browser makers. I sincerely hope we’re not expected to consult a site called canistilluse.com.

I wasn’t the only one surprised by this message.

Simon says:

No, no, no, no! One of the best things about developing for the web is that, as a rule, browsers don’t break old code. Expecting every website and application to have an active team of developers maintaining it at all times is not how the web should work!

Edward Faulkner:

Most organizations and individuals do not have the resources to properly test and debug their website against Chrome canary every six weeks. Anybody who published a spec-compliant website should be able to trust that it will keep working.

Evan You:

This statement seriously undermines my trust in Google as steward for the web platform. When did we go from “never break the web” to “yes we will break the web often and you should be prepared for it”?!

It’s worth pointing out that the original tweet was not an official Google announcement. As Emily says right there on her Twitter account:

Opinions are my own.

Still, I was shaken to see such a cavalier attitude towards breaking changes on the World Wide Web. I know that removing dangerous old features is inevitable, but it should also be exceptional. It should not be taken lightly, and it should certainly not be expected to be an everyday part of web development.

It’s almost miraculous that I can visit the first web page ever published in a modern web browser and it still works. Let’s not become desensitised to how magical that is. I know it’s hard work to push the web forward, constantly add new features, while also maintaining backward compatibility, but it sure is worth it! We have collectively banked three decades worth of trust in the web as a stable place to build a home. Let’s not blow it.

If you published a website ten or twenty years ago, and you didn’t use any proprietary technology but only stuck to web standards, you should rightly expect that site to still work today …and still work ten and twenty years from now.

There was something else that bothered me about that tweet and it’s not something that I saw mentioned in the responses. There was an unspoken assumption that the web is built by professional web developers. That gave me a cold chill.

The web has made great strides in providing more and more powerful features that can be wielded in learnable, declarative, forgiving languages like HTML and CSS. With a bit of learning, anyone can make web pages complete with form validation, lazily-loaded responsive images, and beautiful grids that kick in on larger screens. The barrier to entry for all of those features has lowered over time—they used to require JavaScript or complex hacks. And with free(!) services like Netlify, you could literally drag a folder of web pages from your computer into a browser window and boom!, you’ve published to the entire world.
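
To give a flavour of that, here’s a small sketch of my own (the filenames and class names are illustrative, not from any particular site): built-in form validation, a lazily-loaded responsive image, and a grid that only kicks in on wider screens—no JavaScript required.

  <!-- Form validation handled by the browser -->
  <form>
    <label for="email">Email</label>
    <input id="email" type="email" required>
    <button>Subscribe</button>
  </form>

  <!-- A responsive image that loads lazily -->
  <img src="photo-small.jpg"
       srcset="photo-small.jpg 480w, photo-large.jpg 1200w"
       sizes="(min-width: 40em) 50vw, 100vw"
       loading="lazy"
       alt="A description of the photo">

  <!-- A grid that kicks in on larger screens -->
  <style>
    @media (min-width: 40em) {
      .gallery {
        display: grid;
        grid-template-columns: repeat(3, 1fr);
        gap: 1em;
      }
    }
  </style>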

But the common narrative in the web development community—and amongst browser makers too apparently—is that web development has become more complex; so complex, in fact, that only an elite priesthood are capable of making websites today.

Absolute bollocks.

You can choose to make it really complicated. Convince yourself that “the modern web” is inherently complex and convoluted. But then look at what makes it complex and convoluted: toolchains, build tools, pipelines, frameworks, libraries, and abstractions. Please try to remember that none of those things are required to make a website.

This is for everyone. Not just for everyone to consume, but for everyone to make.

The state of UX

There is much introspection and navel-gazing in the world of user experience design. More than usual, I mean.

Jesse James Garrett recently said:

I don’t think I know anyone that’s been in UX more than a decade who’s happy with how it’s going.

In a recent issue of the dConstruct newsletter—which you really should subscribe to—I pointed to three bowls of porridge left out by three different ursine experience designers.

Mark Hurst wrote Why I’m losing faith in UX. Too hot!

Scott Berkun wrote How To Put Faith in Design. Too cold!

Peter Merholz wrote Waking up from the dream of UX. Just right!

As an aside, does it bother anyone else that the Goldilocks story violates the laws of thermodynamics?

Anyway, this hand-wringing around the role of UX today seemed like a suitably hot topic for one of our regular roundtable chats at Clearleft. We invited Peter along too and he was kind enough to give us his time.

It was a fun discussion. Peter pointed out that whenever he hears an older designer bemoaning the current state of design, he has to wonder what’s happened in their lives to make them feel that way (it’s like when people complain about the music of today and how it’s not as good as the music of whatever time period I was a teenager in). And let’s face it, the good ol’ days weren’t so good for everyone. It was overwhelmingly dominated by privileged white dudes. The more that changes, the better …and it needs to change far, far more.

There was a general agreement that the current gnashing of teeth isn’t unique to UX. It’s something that just about any discipline will inevitably go through. Peter’s epiphany was to compare it with the hand-wringing around Agile:

The frustration exhibited with the “dream of UX” is (I think) identical to the frustration the original Agile community sees with how it has been industrialized (koff-SAFe-koff).

Perhaps the industrialisation of what was once a cottage industry is the price of success. But that’s not necessarily bad, as long as you industrialise the right things. If UX has become the churning out of wireframes at scale, then something has gone very wrong. If UX has become the implementation of dark patterns at scale, then something has gone very wrong.

In some organisations, perhaps that’s exactly what’s happened. In which case, I can totally understand the disillusionment. But in other places, I see the opposite happening. I see UX designers bringing questions of ethics to the forefront. I see UX designers—dare I say it?—having their proverbial seat at the table.

Chris went so far as to claim that we are in fact in a golden age of user experience design. Controversial! But think about it, he said. Over the next few days, pay attention to interactions you have with technology, and consider the thought and skill that has gone into them.

I had Chris’s provocation in mind when I wrote about booking my vaccination appointment:

I just need to get in, accomplish my task, and get out again. This is where the World Wide Web shines.

Maybe Chris is right. Maybe the golden age of UX is here. It’s just not evenly distributed. Yet.

It’s an interesting time for the discipline of user experience design. I’ve always maintained that the best way to get a temperature check for your chosen field is to go to a really good conference. If you’re a UX designer and you want to understand the state of the UX nation, you should get a ticket for the online UX Fest in June. See you there!

Switching

Chris has written about switching code editors. I’m a real stick-in-the-mud when it comes to switching editors. Partly that’s because I’m generally pretty happy with whatever I’m using (right now it’s Atom) but it’s also because I just don’t get that excited about software like this. I probably should care more; I spend plenty of time inside a code editor. And I should really take the time to get to grips with features like keyboard shortcuts—I’m sure I’m working very inefficiently. But, like I said, I find it hard to care enough, and on the whole, I’m content.

I was struck by this observation from Chris:

When moving, I have to take time to make sure it works pretty much like the old one.

That reminded me of a recent switch I made, not with code editors, but with browsers.

I’ve been using Chrome for years. One day it started crashing a lot. So I decided to make the switch to Firefox. Looking back, I’m glad to have had this prompt—I think it’s good to shake things up every now and then, so I don’t get too complacent (says the hypocrite who can’t be bothered to try a new code editor).

Just as Chris noticed with code editors, it was really important that I could move bookmarks (and bookmarklets!) over to my new browser. On the whole, it went pretty smoothly. I had to seek out a few browser extensions but that was pretty much it. And because I use a password manager, logging into all my usual services wasn’t a hassle.

Of all the pieces of software on my computer, the web browser is the one where I definitely spend the most time: reading, linking, publishing. At this point, I’m very used to life with Firefox as my main browser. It’s speedy and stable, and the dev tools are very similar to Chrome’s.

Maybe I’ll switch to Safari at some point. Like I said, I think it’s good to shake things up and get out of my comfort zone.

Now, if I really wanted to get out of my comfort zone, I’d switch operating systems like Dave did with his move to Windows. And I should really try using a different phone OS. Again, this is something that Dave tried with his switch to Android (although that turned out to be unacceptably creepy), and Paul did it ages ago using a Windows phone for a week.

There’s probably a balance to be struck here. I think it’s good to change code editors, browsers, even operating systems and phones every now and then, but I don’t want to feel like I’m constantly in learning mode. There’s something to be said for using tools that are comfortable and familiar, even if they’re outdated.

Choosing tools for scaling design

Tools and processes are intertwined. A company or a department or an individual has a way of doing things—that’s the process. They also have software to carry out the process—those are the tools.

Ideally, they should be loosely coupled. You should be able to change your tools without necessarily changing your process. So swapping out, say, one framework or library for another shouldn’t involve fundamentally changing the way you work. Likewise, trying a new way of working shouldn’t require you to use unfamiliar tools.

When it comes to scaling design within organisations, the challenges are almost always around switching processes (well, really it’s about trying to change culture, but that starts with changing processes—any sufficiently advanced process is indistinguishable from culture). All too often, though, I see people getting hung up on the tools.

We need to get more efficient in how we deliver designs …so let’s switch over to this particular design tool.

We should have a design system …so let’s get everyone using this particular JavaScript framework.

I understand this desire to shortcut the work of figuring out processes and jump straight to production solutions. For one thing, it allows you to create an easy list of requirements when it comes to recruiting talent: “Join our company—you must demonstrate experience and proficiency in this tool or that library.”

But when tools and processes become tightly coupled like this, there’s a real danger of stagnation. If a process can be defined as “the way we do things around here”, that’s not something you want to tie to any particular tool or technology. Otherwise, before you know it, you’re in the frustrating situation of using outdated tools, but you can’t swap them out for newer or better-suited technologies without disrupting everyone’s work.

This is technical debt (although it applies just as much to design). You’re paying a penalty in the present because of a decision that somebody made in the past. The problem isn’t so much with the decision itself, but with the longevity of its effects.

I think it’s important to remember what a tool is: it’s a piece of technology that enables you to work faster or better. You should enjoy using your tools, but you shouldn’t be utterly dependent on any particular one. Otherwise, the tail starts wagging the dog—you are now in service to the tool, instead of the other way around.

Treat your tools like cattle, not pets. Don’t get too attached to any one technology to the detriment of missing out on others.

Mind you, if you constantly tried every single new tool or technology out there, you’d never settle on anything—I’m pretty sure that three new JavaScript frameworks have been released since you started reading this paragraph.

The tools you choose at any particular time should be suited to what you’re trying to accomplish at that time. In other words, you’ve got to figure out what you’re trying to accomplish first (the vision), then figure out how you’re going to accomplish it (the process), and only then figure out which tools are the best fit. If you jump straight to choosing tools, you could end up trying to tighten a screw with a hammer.

Alas, I’ve seen plenty of consultants who conflate strategy with tooling. They’re brought in to solve process problems and, surprise, surprise, the solution always seems to involve purchasing the software that their company sells. I’ve been guilty of this myself: I see an organisation struggling to systemise their design patterns, and I think “Oh, they should use Fractal!” …but that’s jumping the gun. They might be better served with something simpler, or something more complex (I mean, Fractal is very, very flexible but it’s still just one option—there are plenty of other pattern library tools out there).

Once you separate out the tools from the process, there’s an added benefit. Making the right technology choice is no longer a life-or-death decision. You can suck it and see. Try out the technology and see if it works. If it’s working, great! Carry on using it. If it’s not working, that’s okay too. Try something different.

I realise I’m oversimplifying things, but I honestly believe that the real challenge is not choosing the right tools, but figuring out the right process for your team.

Process and culture

Cameron has a bone to pick. Why, oh, why, he wonders, are we so quick to create processes when what we really need is a good strong culture?

Strong culture = less process

To stop people breaking stuff: make a process for it. Want to make people act responsibly: make a process for it. Tired of telling people about something? Make a process for it.

For any single scenario you can name it’ll be easier to create a process for it than build a culture that handles it automatically. But each process is a tiny cut away from the freedom that you want your team to enjoy.

I take his point, but I also think that some processes are not only inevitable, but downright positive. There should be a process for handling payroll. There should be a process for handling promotions. Leaving that to culture might sound nice and nimble, but it could also lead to unintentional bias and unfairness.

But let’s leave those kind of operational processes aside and focus on process and culture when it comes to design and engineering. Cameron’s point is well taken here. Surely you want people to just know the way things are done? Surely you want people to just get on with doing the work without putting hurdles in their way?

On the face of it, yes. If you’re trying to scale design at your organisation, then every extra bit of process is going to slow down your progress.

But what if speed isn’t the most important metric of success when it comes to scaling design? You’ve got to make sure you’re scaling the right things.

Mark writes:

This is a post in defence of process. Yes, I know what you’re thinking: ‘urgh, process is a thing put in place to make up for mediocre teams’; or ‘prioritise discussion over documentation’; or ‘I get enough red tape in other parts of my life’.

The example he gives is undeniably a process that will slow things down …deliberately.

Whenever someone asks me to do something that I think seems ill-conceived in some way, I ask them to write it down. That’s it. Because writing is high effort. Making sentences is the easy bit, it’s the thinking I want them to do. By considering their request it slows them down. Maybe 30% of the time or something, they come back and say ‘oh, that thing I asked you to do, I’ve had a think and it’s fine, we don’t need to do it’.

I’ve seen this same tactic employed in standards bodies. Somebody bursts into a group and says “I’ve got a great idea—we should make this a thing!” The response, no matter what the idea is, is to say “Document use-cases.” It’s a stumbling block, and also a bit of a test—if they do come back with use-cases, the idea can be taken seriously; the initial enthusiasm needs to be backed up with hard graft.

(On a personal level, I sometimes use a little trick when it comes to email. If someone sends me a short email that would require a long response from me, I’ll quickly fire back a clarifying question: “Quick question: did you mean X or Y?” Now the ball is back in their court. If they respond swiftly with an answer to my question, then they’ve demonstrated their commitment and I honour their initial request.)

Anyway, it sounds like Cameron is saying that process is bad, and Mark is saying process can be good. Cody Cowan from Postlight thinks they’re both right:

To put it bluntly: people, not process, are the problem.

Even so, he acknowledges Cameron’s concern:

One of the biggest fears that people have about process is that something new is going to disrupt their work, only to be replaced by yet another rule or technique.

I think we can all agree that pointlessly cumbersome processes are bad. The disagreement is about whether all processes are inherently bad, or whether some processes are not only necessary, but sometimes even beneficial.

When Cameron talks about the importance of company culture, he knows whereof he speaks. He’s been part of Canva’s journey from a handful of people to hundreds of people. They’ve managed to scale their (excellent) culture along the way. That’s quite an achievement—scaling culture is really, really challenging. Scaling design is hard. Scaling culture is even harder.

But you know what’s even more challenging than scaling culture? Changing culture.

What if your company didn’t start with a great culture to begin with? What if you’re not Canva? What if you’re not AirBnB? What are your options then?

You can’t create a time travel machine to go back to the founding of the company and ensure a good culture from the outset.

You can’t shut down your existing company and create a new company from scratch, this time with a better culture.

You’ve got to work with what you’ve got. That doesn’t mean you can’t change your company culture, but it’s not going to be easy. Culture is pretty far down the stack of pace layers—it’s slow to change. But you can influence culture by changing something that’s less slow to change. I would argue the perfect medium for this is …process.

Once you know what values you’re trying to embed into your culture, create processes that amplify and reward those values. I totally understand the worry that these processes will reduce autonomy and freedom, but I think that only applies if the company already has a strong culture of autonomy and freedom. If you’re trying to create a culture of autonomy and freedom, then—as counter-intuitive as it may seem—you can start by putting processes in place.

Then, over time, those processes can seep into the day-to-day understanding of how things are done. Process dissolves into culture. It’s a long game to play, but as Cameron points out, that’s the nature of culture change:

Where culture pays off is in the long run. It’s hard work: defining the culture, hiring for the culture and communicating the culture again, and again, and again. But if you want to make a company where people are empowered, passionate, and champions of your organisation then it’s the only path forward.

Where to start?

A lot of the talks at this year’s Chrome Dev Summit were about progressive web apps. This makes me happy. But I think the focus is perhaps a bit too much on the “app” part and not enough on the “progressive” part.

What I mean is that there’s an inevitable tendency to focus on technologies—Service Workers, HTTPS, manifest files—and not so much on the approach. That’s understandable. The technologies are concrete, demonstrable things, whereas approaches, mindsets, and processes are far more nebulous in comparison.

Still, I think that the most important facet of building a robust, resilient website is how you approach building it rather than what you build it with.

Many of the progressive web app demos use server-side and client-side rendering, which is great …but that aspect tends to get glossed over:

Browsers without service worker support should always be served a fall-back experience. In our demo, we fall back to basic static server-side rendering, but this is only one of many options.

I think it’s vital to not think in terms of older browsers “falling back” but to think in terms of newer browsers getting a turbo-boost. That may sound like a nit-picky semantic subtlety, but it’s actually a radical difference in mindset.

Many of the arguments I’ve heard against progressive enhancement—like Tom’s presentation at Responsive Field Day—talk about the burdensome overhead of having to bolt on functionality for older or less-capable browsers (even Jake has done this). But the whole point of progressive enhancement is that you start with the simplest possible functionality for the greatest number of users. If anything gets bolted on, it’s the more advanced functionality for the newer or more capable browsers.
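
A minimal sketch of what that looks like in practice (the /service-worker.js path is just an assumption on my part): the page is already fully functional, and this snippet only adds offline capabilities for browsers that can handle them.

  // Enhancement, not requirement: feature-detect before registering.
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/service-worker.js')
      .then(function (registration) {
        console.log('Turbo-boost enabled for scope:', registration.scope);
      })
      .catch(function (error) {
        // The basic, server-rendered experience still works fine without it.
        console.log('Service worker registration failed:', error);
      });
  }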

So if your conception of progressive enhancement is that it’s an added extra, I think you really need to turn that thinking around. And that’s hard. It’s hard because you need to rewire some well-engrained pathways.

There is some precedent for this though. It was really, really hard to convince people to stop using tables for layout and start using CSS instead. That was a tall order—completely change the way you approach building on the web. But eventually we got there.

When Ethan came out with Responsive Web Design, it was an equally difficult pill to swallow, not because of the technologies involved—media queries, percentages, etc.—but because of the change in thinking that was required. But eventually we got there.

These kinds of fundamental changes are inevitably painful …at first. After years of building websites using tables for layout, creating your first CSS-based layout was demoralisingly difficult. But the second time was a bit easier. And the third time, easier still. Until eventually it just became normal.

Likewise with responsive design. After years of building fixed-width websites, trying to build in a fluid, flexible way was frustratingly hard. But the second time wasn’t quite as hard. And the third time …well, eventually it just became normal.

So if you’re used to thinking of the all-singing, all-dancing version of your site as the starting point, it’s going to be really, really hard to instead start by building the most basic, accessible version first and then work up to the all-singing, all-dancing version …at first. But eventually it will just become normal.

For now, though, it’s going to take work.

The recent redesign of Google+ is a true case study in building a performant, responsive, progressive site:

With server-side rendering we make sure that the user can begin reading as soon as the HTML is loaded, and no JavaScript needs to run in order to update the contents of the page. Once the page is loaded and the user clicks on a link, we do not want to perform a full round-trip to render everything again. This is where client-side rendering becomes important — we just need to fetch the data and the templates, and render the new page on the client. This involves lots of tradeoffs; so we used a framework that makes server-side and client-side rendering easy without the downside of having to implement everything twice — on the server and on the client.

This took work. Had they chosen to rely on client-side rendering alone, they could have built something quicker. But I think it was worth laying that solid foundation. And the next time they need to build something this way, it’s going to be less work. Eventually it just becomes normal.

But it all starts with thinking of the server-side rendering as the default. Server-side rendering is not a fallback; client-side rendering is an enhancement.

That’s exactly the kind of mindset that enables Jack Franklin to build robust, resilient websites:

Now we’ll build the React application entirely on the server, before adding the client-side JavaScript right at the end.

I had a chance to chat briefly with Jack at the Edge conference in London and I congratulated him on the launch of a Go Cardless site that used exactly this technique. He told me that the decision to flip the switch and make it act as a single page app came right at the end of the project. Server-side rendering was the default; client-side rendering was added later.
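
For what it’s worth, here’s a hedged sketch of that pattern (renderFromJSON is a stand-in for whatever client-side templating you use, and it assumes the server can answer the same URL with JSON): every link works as a normal server-rendered page load, and this script, if and when it runs, merely short-circuits the round trip.

  document.addEventListener('click', function (event) {
    const link = event.target.closest('a');
    // Only intercept same-origin links; everything else gets the default behaviour.
    if (!link || link.origin !== location.origin) {
      return;
    }
    event.preventDefault();
    fetch(link.href, { headers: { 'Accept': 'application/json' } })
      .then(function (response) { return response.json(); })
      .then(function (data) {
        renderFromJSON(data); // hypothetical client-side rendering step
        history.pushState(null, '', link.href);
      })
      .catch(function () {
        // If the enhancement fails for any reason, fall back to a full page load.
        window.location.href = link.href;
      });
  });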

The key to building modern, resilient, progressive sites doesn’t lie in browser technologies or frameworks; it lies in how we think about the task at hand; how we approach building from the ground up rather than the top down. Changing the way we fundamentally think about building for the web is inevitably going to be challenging …at first. But it will also be immensely rewarding.