Off Script: Designing the future with AI

The emergence of modern AI is prompting us to rethink how we approach the very essence of product design.

This is what we’re exploring in the latest episode of Off Script, our series of candid conversations with Intercom leaders about the extraordinary technological shift being driven by AI.

There’s a certain irony in the urge to reflect on the past when confronted with the possibilities of an unknown future, and the latest advances in generative AI have certainly got us thinking – not just about the changes that are coming, but also about what they mean for our roles and our craft.

Every new wave of technology has brought about transformative changes in how users interact with digital systems and devices – from the earliest computers with punch cards to modern icons on touchscreens – and with each new leap, our interfaces have become more accessible, more intuitive, and more aligned with the way we communicate.

In this episode of Off Script, our VP of Product Design, Emmet Connolly, talks about how AI is fundamentally changing the way we think about building interfaces, the role of software designers, and what the future may hold as we forge ahead.

Here are some key takeaways from the episode:

  • As technology advances, user interfaces become less abstract and increasingly mirror human communication.
  • In the early stages of new technology, interfaces often utilize familiar design metaphors to help user adoption, but UIs always evolve in ways that are truly native to the underlying technology.
  • Moore’s Law allowed us to predict the trajectory of computer speed over time, but AI capabilities are advancing so fast that we can’t predict what next year’s iteration will look like.
  • Soon, AI may be able to simplify complex user interactions by generating tailored, intuitive, and frictionless interfaces for each specific use case.
  • AI’s commodification of basic tasks will not only raise the standards for acceptable design but also allow designers to focus on higher-level challenges that elevate their craft.
  • Working with AI in design comes with unpredictability, breakneck speed, creative chaos, and plenty of iteration. Embrace it.

We publish new Off Script episodes on the second Thursday of every month – you can find them right here or on YouTube.

What follows is a lightly edited transcript of the episode.


Off Script: Episode 4
Emmet Connolly on Designing the Future with AI

Eoghan McCabe: Intercom is a strange and unique company. We’re a large, late-stage software firm that’s also bet everything on AI. You typically only find early-stage startups that are as all-in on AI in the way that we are. This means that we have a depth of expertise with building excellent software. We’ve done that for over a decade, we’re known for it. But we’re learning that, while that’s served us well in the past, we’re going to need to rethink our approach for this AI future.

In this episode of Off Script, Emmet Connolly, our longtime head of Design at Intercom and a core contributor to every strategic decision we make here, looks back on where we’ve come from and forward to where we’re going. He explains how the advent of modern AI has us revisiting everything – from how we design and build interfaces, all the way to fundamentally rethinking our products and the role of software designers here too.

This is us very much sharing what we’re learning at scale as we pivot in the hopes that it’ll be valuable to you as you start to do the same. I hope you enjoy it.

The evolution of interfaces

Emmet Connolly: Those that do not learn from history are doomed to repeat it. If you look at the past, you can often pattern-match from previous waves of technology, or you can maybe plot the course of change and extrapolate and try and predict what the future is going to hold.

“Every new wave of technology has led to new user interfaces, and the technology often unlocks some new user interface innovation”

Marshall McLuhan called this “looking at the present through a rearview mirror.” Looking through a rearview mirror is a somewhat limiting frame of reference – so you can’t always take those frameworks from the past and just apply them to what’s going to happen next.

The rest of the McLuhan quote is “And in doing so we walk backwards into the future,” and I think that’s a good way of thinking about how products might change as a result of AI. There’s a lot we can learn from looking back, but there’s also this sense a lot might be different this time around.

Every new wave of technology has led to new user interfaces and the technology is often an unlock to some new user interface innovation. If you look back into the 19th century and think about where the telegraph came from, there were solutions for long-distance communication at the time, and a lot of them were not that great. They were things like sending mail by train or pony express or whatever, and they had some limitations. When electricity and an understanding of magnetism came along, that was the unlock that allowed the telegraph to be invented and solve the problems of a lot of the previous user interfaces for communicating over long distances.

“The user interface becomes less abstracted away, less the language of computers, and more the language of humans”

But some of the inherent limitations in this new thing become readily apparent very quickly as well. You require miles and miles of cabling, so it can only be done between a very specific point A and point B, and you require experts at each end to encode and decode. Almost as soon as a new technology unlocks a new interface, the limitations of that interface become apparent and we wait for the next unlock to happen. The limitations of the telegraph are what led to radio; limitations of radio led to television.

And we’ve seen this in the computer age as well: from mainframe computers leading to personal computers with command line interfaces, which led to graphical user interfaces, which led to multitouch interfaces.

You notice with each of those new hops you make, the user interface becomes less abstracted away, less the language of computers, and more the language of humans. I think the thing that has really blown people’s minds with AI arriving on the scene is it is, by far, the most – almost to an uncanny sense – human-like form of technology that we have encountered so far, and that’s because of the human-like language aspect of it.

Making sense of new technology

In the very earliest stages of a new technology, we tend to grasp on to familiar metaphors as a way of engaging with the new technology initially. A keyboard is a keyboard because of typewriters. With the desktop operating system, the earliest versions looked like the surface of a desk, and that’s why it’s called a desktop. Early iPhone apps had this skeuomorphic look, which was a very tactile look and feel to help people understand they could touch and interact with these things by direct manipulation. There is always an evolution from that point where we will figure out, “Oh, what is the more sophisticated, expressive, native form to the new technology that this can take?”

You see this in all sorts of media through time. If you look at early cinema, for example, the very first silent movies were like, “Take a camera, point it at a theater stage, and record a play.” It was a thing that people were very familiar with, and the early versions that started to move beyond that, in some sense, felt unfamiliar to people at first. There’s the famous example of the Lumière brothers’ train pulling into a station, coming fast towards the camera and the audience running terrified from the theater because they thought it was going to come out. But they get the hang of it soon.

“The very familiar user interface of messaging is a handy metaphor to apply to AI and a great starting point, but it almost certainly won’t be the end point”

The new language co-develops, in a sense, between the makers and the audience. You had filmmakers like Eisenstein experimenting with editing and developing, with the audience, a shared language of cinema, so today, you can look at a closeup or a reaction shot and you understand, from a storytelling point of view, what those things mean. You can even go further and start to subvert it and do non-linear storytelling and the audience will come with you on that journey.

When we think of AI user interfaces today, we mostly think of chat apps like ChatGPT, and that is because the very familiar user interface of messaging is a handy metaphor to apply to AI and a great starting point, but it almost certainly won’t be the end point. We will continue to evolve from here.

Expect the unexpected

We have often been able to look at the past and do that pattern-matching thing. One example of that is Moore’s Law. Moore’s Law says the number of transistors on an integrated circuit doubles every couple of years, which is a nerdy way of saying that, in a steady, predictable way, computers get faster every couple of years.

That has meant that we have been able to extrapolate and figure out, with a fair degree of certainty, how fast computers are going to be a couple of years from now and plan accordingly. The same doesn’t hold for AI at all. We have had so much new capability technologically available to us very suddenly that there’s a lot more juice we can squeeze out of that in terms of inventing products and new ways of doing things even with today’s capabilities.
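As a rough illustration of the kind of extrapolation Moore’s Law made possible: doubling every couple of years compounds exponentially, so the trajectory is easy to project years ahead. This back-of-the-envelope sketch uses illustrative numbers, not real chip data:

```python
# Back-of-the-envelope sketch of the extrapolation Moore's Law allowed.
# A quantity that doubles every `doubling_period` years grows by a factor
# of 2 ** (years / doubling_period) over a given horizon.
def projected_transistors(start: int, years: int, doubling_period: float = 2.0) -> int:
    """Project a transistor count forward under a steady doubling cadence."""
    return int(start * 2 ** (years / doubling_period))

# e.g. a chip with 1 billion transistors, projected 10 years out:
projected_transistors(1_000_000_000, 10)  # -> 32 billion
```

The point of the sketch is the planning it enabled: the same function answers "how fast will computers be in N years?" with fair confidence, which is exactly the predictability AI capabilities currently lack.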

“It is easier to predict what will go away than to predict the precise solution that’s going to replace it”

There’s probably also an adoption overhang. The emergence of the model capabilities is a lot more like putting a bunch of information into the thing and seeing what comes out of it. It’s a lot more discovery than invention, and even the makers of the models can’t predict what next year’s iteration of the models is going to look like. The pace of change is very, very fast. That degree of uncertainty is making a lot of people nervous.

It is easier to predict what will go away than it is to predict the precise solution that’s going to replace it. We have been building customer service software for over a decade, and a lot of what you do is you chip away, year after year, to try to improve existing workflows that are there, but there are a lot of things that almost seem like immovable objects.

“Very soon, hearing Greensleeves on your phone is going to sound as outdated and anachronistic as a dial-up modem sound sounds to us today”

In customer service, that might be a massive inbound volume of questions from your customers to deal with, or your product might have a certain degree of real deep complexity that you have to know a lot about in order to answer the full range of questions that might come into you. And AI has the potential to change that. For example, the amount of inbound volume doesn’t necessarily need to be a thing that you can’t get away from because a certain amount of that can be dealt with via AI.

It’s almost a meme or a joke about customer service that you’re going to spend an hour listening to a tinny, 20-second loop of Greensleeves on your phone while you’re waiting for someone to pick up on the other end. Very soon, hearing Greensleeves on your phone is going to sound as outdated and anachronistic as a dial-up modem sound sounds to us today.

I saw this little website recently – you can put in your Twitter username and it tells you what your first tweet was. My first tweet was in November 2006 – that’s a long time ago. It was something like, “I’m sitting here on hold. For a list of ways in which technology has failed to improve your life, press three now.”

Some things feel like they never change in technology if you take a long view back, but maybe, if you wait around long enough, you’ll eventually have the chance to make a change.

AI-generated user interfaces

Anyone who uses software or products today will probably have this feeling, “Man, this thing is really complex, why is it so complicated?” The answer is not, in most cases, because the designer was really bad at their job – it was because the nature of the products we use is that they often have to serve a very diverse user base.

If you think about how our cities are designed, do you optimize for the cyclist or the driver? The answer is that both of them are legitimate users of your city and you have to try to serve both of them. But what you often end up with is, if you’ll pardon the pun, a middle-of-the-road solution.

“You can have a nice, clean, simple UI that is custom generated at the moment for the precise use case it’s needed for”

How can AI solve this complexity problem? Generative AI is really good at generating stuff. The early versions we saw were text, and then we saw lots of images being generated and, more recently, we’re seeing audio and video being generated with pretty plausible-looking results. So it is not a massive leap of the imagination to imagine user interfaces being generated on the fly. Instead of these general-purpose but somewhat compromised tools that get bogged down in complexity, you can have a nice, clean, simple UI that is custom generated at the moment for the precise use case, for the precise user, it’s needed for.

Imagine you contact a company and let them know you want to change your billing information. Now, if we just use a chat UI and do that entirely through AI, it’s still pretty laborious to type “I want to change my billing information.” And then you get a text reply and a chat bubble, and it says, “Cool, what’s your new address?” And you type your address, and then it says, “Oh, you left out your postcode, what’s that?” And you type that and then it says, “Is your billing address the same as your postal address?”

It is not a very frictionless user interface. Instead of that, you can imagine a generative UI version where you say “I want to update my billing information,” and you might just get a simple little widget that has a couple of form fields to fill out and a checkbox.

The advantage of that is it has a lot of properties of good user interfaces – it tells you what you can do instead of being faced with a blank text box so you’re left wondering what you can even put in there.
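To make the idea concrete, here is a minimal sketch of what that generative-UI step could look like: a user intent mapped to a small declarative form schema that a client could render as a widget. The function name, field names, and schema shape are all hypothetical illustrations, not a real product API; in a real system a model would produce the schema rather than a hard-coded mapping:

```python
# Hypothetical sketch: instead of a chat back-and-forth, the system returns a
# small declarative form schema the client renders as a widget. In practice a
# model would generate this schema; here it is stubbed for illustration.

def generate_widget(intent: str) -> dict:
    """Map a user intent to a renderable form-schema 'widget' (illustrative stub)."""
    if intent == "update_billing_info":
        return {
            "title": "Update billing information",
            "fields": [
                {"name": "address", "type": "text", "label": "New address"},
                {"name": "postcode", "type": "text", "label": "Postcode"},
            ],
            "checkboxes": [
                {"name": "same_as_postal",
                 "label": "Billing address is the same as postal address"},
            ],
        }
    # No tailored widget for this intent: fall back to plain chat
    return {"title": "Chat", "fields": [], "checkboxes": []}

widget = generate_widget("update_billing_info")
```

Note how the schema encodes the properties of a good interface described above: the fields and labels tell you exactly what you can do, instead of leaving you facing a blank text box.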

The high costs of friction

I used to work at Google and I learned something quite interesting about friction in user interfaces. I was working on the search results page and we artificially slowed down how long it took for the search results page to load, but only by a minute amount, something like 40 milliseconds.

Nobody complained and nobody wrote in, but their behavior actually changed. The cohort of people who got the artificially slower results, even though the delay was below conscious human perception, searched less frequently on a daily basis.

Something deep in the crevices of their brain made this kind of trade-off and said, “Nah, it’s not quite worth it to perform that search because of the friction involved.” Choosing the right modality, the one that provides the least friction for getting an answer or performing a task, should always be something designers are thinking about.

The AI road ahead

These big disruptive waves of technology don’t simply leave existing workflows as they are and reduce friction. They will tend to turn over the whole apple cart and upset things in a much more obvious way.

We’ve seen a lot of “AI sparkle emoji features”, where you have a button and add a little AI sparkle emoji to it and it does that thing automatically instead of manually, but that really is, in many cases, just the starting point for thinking about how AI will impact the design of your product. You’ve got to have a sense of where you’re going, whether this is a small set of optimizations or a complete rethink of your product category.

“We have kind of a framework both for us, as people who develop products, and for our customers, as the people who want to have a sense of where this is going”

Self-driving cars are a good analogy here. The industry actually defined levels of self-drivingness from level zero to level five, where level zero is no automation, level one is small amounts of automation, like cruise control, and all the way up to level five, which is full self-driving.

If you go to San Francisco today, you see empty cars driving themselves around the streets. Again, going through enough of those levels is how you eventually get to where you’re trying to go.

We’ve done something similar and tried to develop a kind of maturity framework for AI adoption in customer service, going all the way from what you might think of as level zero – no automation, just humans sitting there typing responses to customers all day – to what you might think of as level five, which we are certainly not at yet, of AI agents resolving 100% of customer queries.

But at least we have a framework, both for us, as the people who develop products, and for our customers, who want to have a sense of where this is going and whether it’s the path they want to be on.
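As a sketch of what such a maturity ladder might look like when made explicit, here is an illustrative enumeration modeled on the self-driving levels. The level names and the helper are assumptions for illustration, not Intercom’s official framework:

```python
from enum import IntEnum

# Illustrative maturity ladder for AI adoption in customer service, modeled
# on the SAE-style self-driving levels. Names are assumptions, not an
# official framework.
class CSAutomationLevel(IntEnum):
    NO_AUTOMATION = 0   # humans typing every response
    ASSISTED = 1        # suggested replies, small automations
    PARTIAL = 2         # AI handles simple, common questions
    CONDITIONAL = 3     # AI resolves many queries, humans on standby
    HIGH = 4            # humans handle only edge cases
    FULL = 5            # AI agents resolve 100% of queries (not yet reached)

def needs_human_team(level: CSAutomationLevel) -> bool:
    """Every level short of FULL still needs humans in the loop."""
    return level < CSAutomationLevel.FULL
```

Because the levels are ordered integers, a team can place itself on the ladder and compare against where it wants to be, which is the planning value the framework is meant to provide.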

The human touch endures

It is almost certain that Figma, a very common tool that designers use to create their UIs today, will develop this capability: a box that you can type into and say, “Make me a UI for an iPhone app for a dog walking service.” And hey presto, you get it, rendered right there.

Ed note: Since Emmet recorded this episode, Figma launched this feature, and then removed it.

What’s your reaction to that development? Is it, “Oh shit, I’m about to be put out of a job,” or is it, “Oh, that’s actually pretty interesting and exciting because that is the least important aspect of my job today”?

I think the drawing rectangles aspect of the role, frankly, is the most junior designer aspect of the role and one that is very time-consuming. Most designers want to spend most of their time on other things like problem-solving, coming up with ideas, being creative, and offloading some of the more manual work aspects of the role for AI to handle will hopefully free us up to spend more time on those more creativity-oriented things.

“Human-centered design will still have a role in pushing culture forward, pushing the state of the art forward, and pushing taste forward”

If you look at design today, there is a very broad element of sameness to a lot of stuff. All the websites look the same, all the smartphones look the same, all the cars look the same, the Airbnbs you stay in look the same, the hipster coffee shops look the same, the movie posters look the same. There is a lot of sameness, and this has been referred to as the “age of average”.

All design exists on a spectrum from cheap and fast to expensive and high-quality. And if AI can commodify the cheap and fast end of that spectrum, it raises the floor for what we would expect the worst design to be. It also raises the ceiling for what designers can do with their craft and the level of taste and creativity they can apply there. Once the creation of same-y design artifacts does become commodified, it creates an opportunity for the bespoke, artisanal, thoughtful, opinionated, taste-oriented, more expensive end of the spectrum.

“Even if things like AI provide shortcuts to good enough, the human touch becomes more valued”

Culture is always the back-and-forth conversation between what is normal and mainstream, and the response to that. Human-centered design will still have a role in pushing culture forward, pushing the state-of-the-art forward, and pushing taste forward.

If you were a drummer in the early eighties and the 808 came out, you might have thought, “That’s it. We’re hosed,” but in fact, it resulted in the democratization of who can create music, so you don’t have to become a master at your craft in order to take part. But drummers still exist. There will still be a necessity for people who master the craft, because there is an element of learning craft required to get to mastery, and I think that will continue to be true, even if things like AI provide shortcuts to “good enough”. The human touch becomes more valued.

We recently redesigned the Intercom website. We wanted to create this sense of technology and the more humanistic element coexisting, right? So we used these very big, beautiful, expressive paintings of scenes to frame our product. On Twitter, someone was like, “What was the prompt that you used to create the paintings?” And I replied, “Oh, no prompt. We commissioned artists whose work we loved and respected and wanted the chance to work with.”

People were completely surprised, and somehow, I think that elevated people’s appreciation of the work we had done further because the human aspect was present in it. That will probably be a trend of how we start to look at design and products going forward.

New tools in the tool belt

New technologies often tend to be additive, more like an additional tool in the tool belt than some new thing that comes along and wipes away everything that came before it. Technologies tend to have a very long half-life. They stick around because they give you more options: Do I want to use a drummer or a drum machine? Do I want to use a generated image or a real image?

Designers have been trying to make this type of change happen for quite a while. There’s a thing in design called design systems, where everyone wants a reusable set of components that they can drag and drop into their designs and assemble a design from.

Imagine having the world’s greatest, most up-to-date, most flexible, and most expressive design system at your fingertips. All that said, I acknowledge that change is hard, but focusing on innate human-centered skills and thinking about design as an act of creativity and invention, and not just an act of artifact creation or production, is a healthy way to think about navigating the change that’s coming.

Embrace the squiggle

There is a way of thinking about design that lots of people learn about in design school, and it’s called the double diamond.

It’s literally two diamonds like this. In the first stage, you explore a whole bunch of ideas divergently and then converge on a solution, one way you’re going to solve it. Then, you explore a bunch of different ways of designing and instantiating that solution, and then you converge on one, and you’ve got your product at the end. That is the theory.

The reality is something many people learn when they start designing products for real, and it can be expressed in a different framework for doing design, which is called the squiggle.

And the squiggle is insanity and chaos: mistakes, disagreements, backtracking, changing your mind and changing it back again, until, eventually, out of all that, you find your way to a solution that you’re happy with. Working with AI, with its unpredictability, its fast-changing nature, and its non-determinism, is going to feel a lot more like that latter way of working. And so, the final bit of advice I would give is to embrace the squiggle.

The revolutionary optimism of design

There are many ways in which some of the first- or second-order effects of all of this won’t be good for designers or for society, and in many ways, it’s our job to try to steer it in the right direction and ensure those things don’t actually happen.

I often think of design as this perfect balance between cranky dissatisfaction and almost revolutionary optimism, and both of those things coexisting in harmony.

You have to be cranky in order to be a designer. You go around the place and you see that terribly kerned signage and that user interface that is not intuitive enough, and you need to be pissed off enough to want to go and fix it. It is actually a really optimistic way of looking at the world to be like, “I could make that better and I’m actually going to go and do that.”

So, for me at least, design is fundamentally an act of optimism. A belief that things can be made better. It’s good to look backwards and see what we can learn from the past. It’s fine to look forward with a somewhat skeptical view of how it could all play out. But it’s probably more important to keep pushing forward with that sense of optimism that we can actually make things better.
