Denial of Research

If you work in UX, it will happen to you. You'll face a situation where someone will give your research proposal a hard no. Even the small, "let's talk with a handful of people" idea you put together... no. Let's commiserate on why this happens.

We know what our users want

The most common rebuttal I've experienced is something along the lines of, "We already know what our users want" or "We know all our stakeholders. Just do this." And, speaking solely from my experience, this is almost never true... but it does give you a firm hint of where someone's mindset is on research.

To that person, research may just mean focus groups: asking people their opinions for multiple hours, paying them money, spending a lot of time on it, and not seeing results. Or, it may mean focus groups that serve as the sole input for a project, above stakeholders and competitive analysis. In these cases, education on different research approaches is going to be a tough road but, again in my experience, is the best way to go.

Now, there have been cases where "We know what our users want" has led me to a raft of research and analysis, too! And that's great. That gives me an opportunity to say, "Hey, there's a little gap here – let's talk about it" or "Yep, I agree with the findings." I value research because it continues to move design out of the realm of opinion, where people are arguing for The Thing They Like above all else.

Let's just use best practices

Fuck best practices.

Seriously.

"Best practices", in my experience, is a shortcut. It means, "I don't want to spend any time doing research." That's okay, maybe! It's totally okay. But what it suggests is a misunderstanding of what design offers: "You've done this 28 times before, so this should be easy. Just do what you did the other 28 times." I would augment that by saying, sure, I may have done this 28 times before... but I've never worked with you before. I want to figure out what's unique about you and your problem, truly. And if I can't uncover that, what I make might not fix a damn thing.

In these cases, there might not be much convincing. But talking through what "best practices" means is essential. If it really does mean "I don't have any time or money to spend", then come up with some cheap ways to get that research and data. If you can. If not, call it.

We know this one person, they know everything

Talking with one user and basing your decisions on that person's opinion is a bad idea. Calling it "user research" devalues the practice.

When this happens, it's often a variation on "we know our users". Often, there's one awesome subject matter expert (SME) a stakeholder pool knows, and they look to that person to be a proxy for users. Now, again, there absolutely have been times when that person works well as a proxy: they come with research, they suggest ways to get data they don't have, they act as solid partners.

The flip side of that, in my experience, is the person brought in for opinions only. It's a one-person focus group and, because that person is so respected and known, it eliminates the worry for people. "We know this person, and they know everything, so we feel good about this." Making plans solely around this is risky as hell, because what if that person is wrong? What if they're bringing just opinions? What if they're... human?

These can be tough situations too. I still recommend listening to that SME, for sure, but frame their input within the additional research and context that can give you a better understanding of what you're looking to do.

How to convince someone that these are bad ideas

There's no way to do it. Sorry!

...

I kid. A bit. But here's what I've generally found to be helpful.

  1. Education. Some people respond better to this than others, and you'll need to find how to best communicate it all, but talking through the various types of research and what you get out of it – what the value is – can make a difference.
  2. Scale back your plans. Can't do a four-week IA exploration? Yeah, I hear you. Can you still do an online card sort to get some decent data? How about an in-person working session, or an on-site day? Look for cheaper ways to get the most useful data, knowing that you're making sacrifices.
  3. Give it time. Let things play out. Bring up questions that naturally arise due to the lack of proper research. See how it fits in or impacts your work. Make suggestions along the way.
  4. Live with it. Not everything you create will be perfect. Sorry. Perfect is the enemy of the good, and all that. Know and understand your personal standards on making quality shit, and make sure you're living up to that as much as possible on your project.

We Lost Native Apps

This is a work in progress. More to come, including some organization.

Somewhere between the time the web really took off and today, we lost truly native apps for our computers.

I know. I hear you. “I have Spotify on my iPhone! And I have it on my Mac too!” And yes, you are right. Some would rightly point out that native apps are undergoing a renaissance. While this is technically correct (especially thanks to programming languages and frameworks that let people "skin" an app to look like its native OS), there is something lost here. Take Spotify and Netflix, for instance. The Spotify interface is practically the same from device to device. Netflix used to support native controls and native interfaces, but chose instead to unify under one interface across all devices as much as possible.

This isn't wholly good or bad, although I have an opinion on it.

So what?

A few decades ago, home computers were partially defined by their varying operating systems and programming languages. AppleBASIC was similar to BASIC 2.0, for instance, but not completely the same. Still, one could get the gist of BASIC on a TI-99/4A and mostly take it to a C64.

Once GUIs really took hold, we saw a lot of differentiation come to market. Mac OS was different than OS/2, which was different than BeOS, which was different than Windows, which was different than Amiga OS. Computers were bought and sold on the operating system, and on the PC side, MS-DOS was mostly running the show. Over time, choices diminished and it was down to Mac OS and Windows.

The web grew in popularity. With it came an emerging common interface for accessing information – something new and different. The web, and a browser around it, started to take hold as the definition of what it meant to interact with a computer. That's notable: it wasn't Mac OS or Windows that instigated this change.

Once web apps came around and started to gain acceptance – around the initial AJAX and JavaScript libraries – the days of native apps became numbered. People started to get used to the idea of accessing things the same way on multiple devices without a local OS getting in the way. This is significant and followed through on the idea of the web being an interface for information. More so than a local OS, more so than anything else. You see this in the lineage: much later, we've got apps like Slack and Spotify that are web apps running in a local OS window, and only show their cracks through bugs or quirks.

Once the need to have a truly locally-influenced app is obviated, so is the need to store information locally. Why would you want your files to be on your home Mac when your work PC is where you needed the information? For convenience, we collectively shifted over to storing things online, in the cloud, in places we have no real control over.

Write Once, Run Anywhere

Some application frameworks promise the idea of writing a single codebase and then having that app work on different OSes naturally – iOS, Android, et al. These distill the operating system down to a series of UI widgets at best, and an aesthetic exercise at worst. These apps end up being native technically (that is, to fulfill a requirement on a checklist), but not holistically. Slack and Spotify, for instance, do not operate quite like any other Windows or Mac OS apps. There are conventions that are similar enough to get the gist, but one can tell they're not exactly like the local OS's built-in apps.

What We Lost

Here's where we're at.

  1. OSes used to have solid standards for interaction models. These were enforced by the manufacturers.
  2. The web has never had solid standards for interaction models, but many have emerged and become de facto standards.
  3. Because the web lacked these standards and overtook local OSes in popularity, it influenced the direction of those OSes.
  4. And because of that, apps really don't need to adhere to OS standards anymore. There's no real risk or consequence.

I am not certain whether this is ultimately good or bad. When I first started writing this I really felt that it was bad – it's something we lost, something we don't have anymore.

But on the flip side, we have the promise (if not the actual implementation) of universal interfaces that work from device to device, which is pretty awesome. The unfortunate part is that those interfaces don't need to be consistent from app to app, which is the real pain here: every app is potentially a new learning curve, a new way of interacting, a new way of processing, a new set of shortcuts, a new combination of gestures. That is what we've lost.

It is a different way of locking one into an interaction model: if one learns to use Spotify, it isn't necessarily portable to iTunes (eeeesssshhhh), nor is it portable to Finder or Windows Explorer. It's an adjustment and a learning curve, or at least a situational change, every single time.

It would be ideal to say that we interact with our devices in a certain way, and they will work in that certain way no matter what. That yes, you can access the same app on different devices – but it will reflect the place you're using it more than anything else. That would be ideal. I'm not sure how, or if, we should get back there.

Clunk

I've had a few thoughts on computers stewing in my noggin over the past few weeks.

Smartphones

I've come to the conclusion that, as a general form factor, I still prefer laptop computers over smartphones. I know! This surprised me too, given I once thought the iPad would obviate my need for a laptop. But I realized this: there are tasks that are just a giant pain in the ass on my phone. Consuming and reviewing a lot of data, and analyzing that data. Handling multiple tasks. Typing. iOS, especially, is trying to push from a specialized computing platform to a generalized one (see iPad Pro). The Mac, and laptops generally speaking, are just better at some things. I'm deeply concerned about the no-holds-barred push to mobile, where there is less of an opportunity for openness (hi! what browser runs on your iPhone? who runs the App Store?) and creative ideas.

That's not to say there won't be any. But, shit, animoji? And scanning your face? This is what Apple is applying technology to when we still have gaping holes elsewhere in the broader UI? Yeah. Well.

Keyboards v. WIMP

The great Sarah Emerson linked up this rather good rant from @gravislizard about GUIs versus keyboard-driven interfaces. My first instinct was negative because of the lead tweet – I was using computers in 1983, too, and no one can tell me that loading programs from a tape drive on a VIC-20 was faster than opening a damn tab in a browser. I stand by this.

However, my initial "SOMEONE IS WRONG ON THE INTERNET" stance faded as I read through it. It's not something I perfectly agree with, but I largely do. There are salient points and contextual items I want to call out.

  • GUIs were positioned as a way forward specifically because keyboard interfaces were deemed too complex for the broader public. And they were! Commands had to be memorized or looked up in a reference manual. Knowing how to program – even a bit – was a requirement, even if all someone wanted to do was load up a CB simulator. This was and is hard for a lot of people, and popularizing an abstraction layer based on visuals instead of words was one broad design solution created to address that. Menus and on-screen controls made every command visible and obvious. Remember: we had keyboard overlays to help us.

  • That all said, the keyboard is a highly adaptable and flexible interface. No interface is perfect but, for many tasks, it can absolutely be faster (especially POS systems, as called out in the thread). Repetitive data entry, rote tasks, et al.

  • As GUIs matured – and thinking about consumer window/icon/mouse/pointer (WIMP) specifically – we got to a point where a large number of items piled up in design debt, both conceptual and functional, and were never fixed. In lieu of fixing them, large companies (Apple, Microsoft, Google) moved to touch-based interfaces. This lack of attention is why windows can still steal focus whilst typing on a modern desktop OS in 2017. It could be fixed; these companies just don't care.

  • People should always strive to make computer interfaces easier for other people. But education fell out of this picture a long time ago, and that put all of the burden on the interface to explain itself. Instead of interfaces becoming simpler and more clear, they became more complex – this is general-purpose computing, so it stands to reason. Our technology nowadays implies it's simple due to a lack of a manual; this is a lie. Manuals are not bad. Interfaces, sometimes, do need to be explained. And "no interface is the best interface" is bullshit.

  • I absolutely agree that interfaces, generally speaking, aren't working as hard as they could for users. Google Maps is a great example called out in the thread. Mapping software on phones and computers, generally, stagnated. I suspect this is because mapping in those contexts is seen as a byproduct of navigation, which is something that can be automated (and is, and will be). Ostensibly, these products support other use cases for maps (exploration, education), but it's obvious that's not what they're designed for.

  • General-purpose computing has missed, and continues to miss, a lot of opportunities. It's shifted to watching television, reading the news, communicating with people, and photo/video. There's a lot of untapped potential that is constrained by interfaces and the state of the industry.

I'm not completely pessimistic here. I think there's a lot of places computing can go, but the current setup of Apple and Google running the show, effectively, has severely constrained new ideas and cast aside the spirit of tinkering and exploring that existed in the industry 30-40 years ago. It feels more and more like there's just one path forward for computing products from a consumer perspective – which is false and terribly unimaginative.

Things that give me hope in interface design and general-purpose computing? Pfft, us finally collectively talking about ethics and realizing that a lot of design decisions made over the past few decades were really bad ones. That helps me imagine a more just world, a more equitable world where technology is used for good and serves genuine needs.

I'm also holding out a tiny sliver of hope for the World Wide Web because the non-commercial web still exists at all, and that's a good thing. Despite the increasing complexity of front-end code (trying to put a size 12 foot into a size 5 flat), it's still relatively easy to put things on the web. That remains powerful and democratic. That remains exciting. And that's a case where, when used well, technology can bring us together.

Editor's note: This is a new format I'm trying for posts – much more drafty, less polished. Although it may not appear this way, it has been edited since first published.

 

You're Doing it Right: A Collection of Short Essays on UX

Essay One: Much UX education is really UI education

UX education, as a whole and with exceptions, relies on getting people up to speed on tools in order to convey a sense of competency and achievement. This is problematic for a multitude of reasons. First, it positions UX solely as a UI (and IxD, if you're lucky) exercise. It focuses mostly on the visual interface, because many of the tools out there that are "for" UX are UI tools. Thus, these tools – and the educators who follow this model – are defining user experience as interaction design. This is not inherently bad, but it is rather short-sighted. It allows students to learn an interface without context, a hierarchy without an overriding taxonomy, a series of interactions that look cool but can alienate.

This focus on the interface typically occludes asking why said interface is being created in the first place; it doesn't encourage deep critical thought beyond the site- or app-specific interactions. Yes, we can question why the hierarchy of a product details page is the way it is; no, we cannot question why this product details page is selling something no one actually wants. (This is a simple and extreme example, but it still gets to the point.)

What goes into a "proper" UX education? Great question. I don't have all the answers. But ideas are free so: the liberal arts, the humanities, interaction design, architecture, information architecture, systems design, interface design, research techniques and methodologies, visual design, typography, writing, editing, communication, industrial design, ethics (ethics, ethics, ETHICS), software design, critique, persuasion, salespersonship.

That's probably not even all of it. And not every position will need all those skills, anyway. But I would personally much rather work with and hire people who have a broad base of understanding, because I trust them to figure out the screens later. Designing a good screen is a neat thing, and important, but it is not the only thing.

Essay Two: Certification is a good idea

I have been asked candidly by people looking to get into UX if certain programs or certifications are "worth it". My answer is almost always no. But I absolutely see the value in establishing practices and standards; there simply isn't a broadly-shared consensus yet on what those practices and standards should be. It's a chicken-and-egg problem: if there aren't standards around UX (and there aren't), how can one know if one is hiring a qualified designer? (Yes, insert your answer here.)

It would be pretty amazing, as a hiring manager, to know that I could hire Michael and know she has been accredited to be a UX designer. Or, seeing that Barry is on their way to becoming accredited. It would help me quickly get an understanding of where they are in their careers and where they are headed, prior to an in-depth interview. After all, one could have 10 years of experience "in UX" and not have done a lick of research, or ever given a tough presentation. That same person could be a freaking UI code genius, which may not be a skill I need.

This all said, I must acknowledge that this is exclusionary. I'm okay with this, with a caveat: any certification program must put diversity first, because the entire field and industry benefits from it. This is not an easy thing to figure out, and no one organization, school, or governing body can go it alone.

Essay Three: Bold, ignorant declarations hurt our discipline

I joked earlier this year about the rotating series of topics within UX, suggesting that there was a new topic every week but that they ultimately rotated like a carousel (hey! carousels!), and everyone got mad and upset and then moved on. And that's proven out. I don't think it's inherently bad; rather, it's symptomatic of a lack of agreed-upon standards and practices for our industry. When people write hot takes about how UX is dead because they just heard about research, or how everyone is doing this thing wrong, or how user experience was invented 22 years ago, it is ignorant. I don't think it's done maliciously; it's mostly due to a lack of context which, hilariously, is like the most important thing in UX work, for goodness' sake.

But where is that context obtained? There are publications. There are websites. There are books. There are organizations. But there isn't one place that serves as the entry point, housing our shared history and how this work traces back well before 1995 (for instance). There's a danger here, as history is written by the victors, so the values and contributions of people who are not cis white men must be given their proper credit. As I asked in a series of questions on Twitter a while back, who is telling these historical stories? Where are the stories of the women driving UX forward? Where are the stories of the queer people who make IxD better? I want those stories. And anyone learning UX, I would hope, would too.

Without it, we end up with big essays that get a ton of [insert stupid-metric-du-jour], and fade from our collective memories in moments. It hurts our discipline.

Essay Four: You're doing it right

Until we get a more cohesive agreement and view on our work, we're kind of all winging it. We develop processes. We steal good ones. We toss away bad ones. We make up shit. We research and try and research and fail and research and succeed. We develop a real curriculum. We create structure where there's a mess. We question if UX is real or not. We educate ourselves and our clients. We are creating our own silos – which is why a shared knowledge is more critical than ever. But the most important message here is that you are doing it right. You're making the best decisions you can given the context. And by that measure, you're doing just fine. Keep going.

 

 

IA Summit 2017: Watering the Seeds

Note: this article reflected my feelings on IA Summit at the time of attendance. As of 2018, I've learned of serious safety violations at the conference over many years. Please consider that when reading this, as my experience may not be typical. Until further notice, you should not attend the IA Conference (which is what IA Summit is now known as) or support the IA Foundation. – Ed.

Something was different this year.

Last year I shared a concern I feel every year at IAS: that this conference, one I've now attended for five years with a community of people I love, will no longer feel special.

I echo the sentiments expressed by the great Dylan Wilbanks, whom I met at IAS this year after the polite prodding of everyone we know. Our field, and this conference, is now feeling the effects of a generational shift. When I started in UX, there was no playbook on how to do the work and what was expected; the practice arose out of the “webmaster” role (as did dozens of others) as we learned that our work was more complex and needed more purpose, more focus.

And so I didn't feel the same this year as I did the prior four, not quite. Instead I felt more invested in other people. I wanted to ensure that my first timers (the group that came with me and Andy Fitzgerald to dinner the first night – and the informal group that came with me and Bibi Nunes the second night) were having a good time, and getting the most out of it. I wanted new speakers to know that they were respected, heard, and that the community had their backs. I wanted my friends to know I was there, in presence and in spirit both.

Now. Talks and keynotes were, again, solid. I particularly enjoyed Amber Case's keynote on calm technology and the way we approach tech in general. It, to me, held the strongest connection to the overall theme of Designing for Humans. I can't believe I got to hear Susan Kare talk – such a profound influence on so much of our digital culture. I got a ton out of Dan Ramsden's talk asking what the point of IA was; he's an easy favorite speaker for me. Kyle Soucy challenged my assumptions on KJ analysis and its most practical applications. I took big ideas from Elissa Frankle's talk on the hierarchy of needs in museum experiences, and thought about how they apply to non-physical experiences too.

A favorite moment: on Sunday, a speaker inexplicably missed their speaking slot. The topic was voice UX & UI. A room full of people sat and waited, assuming there would be no session. But there was! Andy Fitzgerald, Shelley Cook, and I started suggesting an impromptu panel and moments later, Amber Case was leading a discussion, recruiting the sole person from the audience who knew voice UX. It was a fireside chat that later turned into a small panel, all on-the-spot without any prep, and pulled off without a hitch.

You Get What You Give

That moment symbolized something significant for me this year, something it took me years to absorb: this is a community-run event, with backing from ASIS&T as always, and the community helps design the direction of the event. Mind you, the co-chairs are the leaders – that's not in dispute. But there are so many parts of the conference that sprang up organically. Acoustic Jam, Polar Bear Yoga... these things happened because people just put them together and recommended them.

Thus I'm publicly sharing that I am hoping and planning to have a more significant role in what happens at IAS next year. It's all at the “Hey, I'd like to do these things...” phase with the chairs, but I have a clear idea on what those things are and what needs to happen in order to make them go. And anyone who knows me can guess what they're about. (I am trying to avoid an Osborne Effect here.)

But the rationale isn't just self-serving. I have found my community at IA Summit, and these are the people I learn from, the people I respect. I want and need this conference to be a welcoming, diverse space that continues to bring in new voices – with the wind at their backs, ready and eager to share – while respecting the work of the giants in our field. I want IA Summit to be that place where all of these people come together, sharing and learning and taking exciting things back to work and then sharing and learning again. I want IA Summit to continue to be special for people who attend, and I now want to help see that through from a content and experience perspective.

On Next Year

IA Summit is in my hometown of Chicago in March. Big ideas are in my head: how will I get Kevin Hoffman to enjoy Lou Malnati's? How can we get everyone out into the neighborhoods to explore? What will the influence of the city be on the program? Can we indeed have buckets of giardiniera at the snack breaks? How can one compress the Chicago experience and culture into just a few days in March?

I'm excited to see what happens. I'll see you there next year.

Prior years' posts, for reference: 2013, 2014, 2016. I also spoke at Five Minute Madness this year.

7 Hot UX Trends for 2017

  1. Building diverse and inclusive design cultures with actions, not words.
  2. Solving real problems that impact people, not making a fucking hair brush with Bluetooth in it.
  3. Watching Apple give no fucks about usability, and watching no other large company step in to take their place, and watching everyone just kind of shrug it off because Jony Ive made a video about their new thing and his accent is pretty great.
  4. The fold makes a stunning comeback!
  5. Confusing UI and UX (repeat until the year 2075).
  6. Building tools and relationships to ensure families aren't torn apart by backwards-looking government policies that undo the progress of the last 50+ years.
  7. Development of apps that allow people to cope with a steady stream of bad/shocking/terrifying news and still be healthy.
  8. Bananas.

IA Summit 2016: Handshakes to Hugs

Note: this article reflected my feelings on IA Summit at the time of attendance. As of 2018, I've learned of serious safety violations at the conference over many years. Please consider that when reading this, as my experience may not be typical. Until further notice, you should not attend the IA Conference (which is what IA Summit is now known as) or support the IA Foundation. – Ed.

I've noticed something during my IA Summit visits. There's a moment, usually three-quarters of the way through the first day, when I find myself in my hotel room staring out the window, wondering, “Is it not going to feel special this year?”

I had that moment my first day there, this year. But by Sunday, that was a distant memory. The doubts were cast aside. And I felt a heady mix of excitement, joy, inspiration, motivation, and love.

This is my 4th IAS in a row. I'm not an old timer, but this year I was asked to host a First Timers' Dinner with the great Stacy Surla. It was a fantastic experience, because I reflected on how much I enjoyed my First Timers' Dinner with Karen McGrane in 2013. As all of us dined together, first as strangers, I looked around the table at all of these ridiculously smart and enthusiastic people forging new friendships, and felt grateful.

Leading up to and during the Opening Reception, I met a lot of new people – in fact, the first person who said hello to me was Brandy Fortune, and she offered to share some guacamole (!). I also met Jesse James Garrett (and talked identity and labels with him and Alberta Soranzo, one of the co-chairs) – something I missed out on last year.

As always, the keynotes and talks were motivating. Lisa Welchman shared her personal stories of how small design decisions can impact us as humans. Cory Doctorow took that same theme into the privacy and security space, discussing how big personal data isn't necessarily ours. Leonie Watson used a creative movie quote slide deck (!) to decimate arguments against accessibility and inclusion. And, Jesse James Garrett gave 7 talks in his closing plenary – a clarion call for IAs to shape the world we live and work in.

The theme, “A Broader Panorama”, was reflected in many of the talks. Inclusion, diversity, accessibility, equality, fairness – these may not sound like IA-related concerns but they truly are. Christina Wodtke touched on this in her personal and powerful 5 Minute Madness wrap up, saying that IA “is not neutral”, a nice bookend to all of the keynotes.

During this conference, I had the good fortune of sharing meals with many of my friends – old and new. We had wide-ranging discussions: from the things we learned, to day-to-day work and lives, to careers. This, to me, is one of the most powerful things about the Summit. Kyle Soucy said it wonderfully during her 5 Minute Madness talk: “This is the conference where a handshake turns into a hug.”

On 5 Minute Madness

This was the first year I was able to stay all day on Sunday, and that meant 5 Minute Madness. I had heard about it after my first IAS in 2013, and only understood it to be a free-for-all that was packed with emotion. And oh my, is it. I knew I had to do it.

Hastily-written notes. I had a plan. I threw it out. Didn't get to the go forth part.

At the end of the conference anyone can line up, take the stage, and speak. That's it. No set topics.

I made it into the line fairly early, and could feel my heart pounding in my chest. I stood and watched my friends take their turns before me and say incredibly powerful things. I had written notes but decided against using them. I got on stage and said what was in my heart. Much of it is a blur now, but I remember feeling that I was this close to completely losing it the entire time. My voice was shaky, my eyes watery. As I spoke I looked around the room, hundreds of people, and saw many faces I knew and many I did not know. I said that not only did everyone here see me professionally, they saw me personally as well. They saw me, and it was all honest and true.

I left the stage, feeling completely emotionally drained, and listened in on others until I needed to leave for the airport. I walked out of the conference room, alone. I ran into a friend on the way down the escalator to the restroom where I felt completely overwhelmed with emotion and had a big ol' breakdown. The love, the joy, the community – and it was over, for now. My friends, my IA Summit family, another place I can call home... was gone for another year.

The Closing

The hashtag activity for #ias16 is all but gone. I'm following a lot of new people on Twitter. I have “The Time of My Life”, the song I did at karaoke with Misty Weaver, stuck in my head a lot. I feel empowered and motivated to do better work. I am taking action. I am working hard to bring it. I miss my friends. But they've also inspired me to be better, to do better.

And next year, we'll do it all again in Vancouver.

This is the best conference. These are the best people.

On Apple Music

Many words have been spilled over the mess that is Apple Music, to the point that towards the end of July it felt like everyone just let their hair down and admitted that it's not all peaches and cream. Jim Dalrymple declared he was done with it, and then Apple directly helped him get some of his music back. (Will they help every person who has had this problem, or only prominent bloggers?) Khoi Vinh was so impressed with the power behind Spotify's new Discover Weekly playlists that he said, “More and more, Apple Music is looking like a disappointment.” 

After using Apple Music for a month, I agree. Apple Music suffers from a number of major problems and, worse, it's not a matter of just fixing one thing to make it all work. But, I can say that it certainly appears that the underlying information architecture of Apple's music offerings is the root cause, and the other issues we see – everything from syncing stupidity to poor experiences to UI weirdness – are the IA wonkiness made manifest.

I'll talk about the underlying IA, followed by the modality Apple has enforced and how it impacts the UI and search, and then share a parting thought on Apple Music. 

How is this an IA problem?

So let's take a half-step back for a sec. When it comes to working with gobs and gobs of information, making sense of it is critical. And that requires the design of an underlying structure that matches the mental model of a user as much as reasonably possible. That structure drives the organization of the information or... yep... the information architecture. This is often expressed as a sitemap or navigation within an app or site, but those elements can exist independently of a broader IA. With Apple and its ecosystem, we need to look at a much higher level than just the Music app or just iTunes (but, god, iTunes has a terrible IA). We need to look at the product offerings and how they play with each other first.

Here, then, are Apple's major top-level categories in its music ecosystem:

  • Music (iOS)
  • iTunes (Mac/PC)
  • iTunes Store (iOS)

Notably, the iTunes Store is separate from Music on iOS. That's not gonna be the case on Android, which muddles things up. Otherwise, though, this top tier looks pretty good. Three big buckets.

Let's go down a level with Music on iOS.

Music (iOS)

  • Apple Music
      • For You
      • New
      • Radio
      • Connect
  • My Music
  • Search
      • Apple Music
          • Top Results
          • Albums
          • Songs
          • Playlists
          • Artists
          • Music Videos
          • Stations
      • My Music
          • Albums
          • Songs
          • Playlists
          • Artists
          • Music Videos
  • Now Playing
      • Up Next

There is something very important here: the split between My Music (which I'm going to call "your music" because that's a model problem too) and Apple Music. That split, as it turns out, is quite critical to the conceptual model of Apple Music.

When I first saw the unveiling, I was excited by the idea of one search box just finding everything whether it was in my music, Apple Music, or the iTunes Store. That was my mental model based on a seemingly-ubiquitous search box throughout the app. Essentially: I'll pay you $10, just let me have access to music.

That is not the case. Apple has a strong wall between these three camps: where your music is, where streaming music is, and where music for sale is. From Apple's perspective, they have three distinct information blobs out there and there is no overlap.

On top of that, your music itself consists of purchased tracks, burned/ripped tracks, and tracks downloaded from Apple Music. These are, roughly speaking, the types of music you may have.

But wait, there's more! Any or all of these types of music can exist in multiple places at the same time, giving them another dimension. They can be in the iCloud Music Library, they can be on your iPhone, they can be on your Mac/PC, they can be on your iPod, they can be on your iPad, they can be in your purchase history, or they can be in "the cloud".

If you subscribe to iTunes Match, all of your tracks are duplicated in the cloud (allegedly). If you use iCloud Music Library, it does this across multiple Macs/PCs too.

Do you see how this is getting muddled? So if we look at the structure of just Your Music, it's really:

Your Music

  • Purchased Tracks (from iTunes Store)
      • On device
      • In purchase history (not "anywhere", just a record in a database)
  • Burned/Ripped Tracks (from CDs, other stores)
      • On device
  • Downloaded Tracks (from Apple Music)
      • On device
  • Cloud-only Tracks (flagged as "yours" in Apple Music)

You can see that the complexity is ratcheting up here. This is, interestingly, a throwback to the old "syncing is hard" problem. Syncing is hard. Because if I add, say, “We're Gonna Make It” by the Orange Peels to My Music, there are a lot of open questions. Did I burn it? Is it an MP3 on my computer? Did I buy it from iTunes? If I remove it from iTunes, where does it go? If I buy it and get a copy from Apple Music, which one wins? What happens if I have the MP3 on my Mac, use iTunes Match and Apple Music, and get a copy from Apple Music on my iPhone? Which one is my file? If I cancel those services, do I have anything?

Right.
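
To make that state space concrete, here's a minimal sketch of it in TypeScript – with the caveat that every name and rule in it is my own invention: my read of Apple's model, not anything Apple has published.

```typescript
// A toy model of the mess above. All names are hypothetical.

// Where a track came from.
type Source =
  | "itunes-store-purchase"
  | "burned-or-ripped"
  | "apple-music-download"
  | "apple-music-cloud-only";

// Where a copy of a track can live. One track may occupy several at once.
type Place =
  | "iphone"
  | "ipad"
  | "ipod"
  | "mac-or-pc"
  | "icloud-music-library"
  | "purchase-history"
  | "the-cloud";

interface Track {
  title: string;
  artist: string;
  source: Source;
  places: Set<Place>; // the same song can be in many places at once
}

// The same song can exist as multiple records with different sources --
// each with different rules for what survives a cancelled subscription.
const copies: Track[] = [
  {
    title: "We're Gonna Make It",
    artist: "The Orange Peels",
    source: "burned-or-ripped",
    places: new Set<Place>(["mac-or-pc"]),
  },
  {
    title: "We're Gonna Make It",
    artist: "The Orange Peels",
    source: "apple-music-download",
    places: new Set<Place>(["iphone", "the-cloud"]),
  },
];

// "If I cancel, do I have anything?" In this model, only the
// non-subscription copies unambiguously survive.
const survivors = copies.filter(
  (t) => t.source === "burned-or-ripped" || t.source === "itunes-store-purchase",
);
console.log(survivors.length); // 1 -- just the ripped MP3 on the Mac
```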

This is why a single-box search that spews out multiple types of results can be messy (although, for me, ideal!). When I search for "Janelle Monae", the modality of the interaction now matters. So, Apple generally forces you to enter a mode in order to search. On iOS Music, search is conspicuously absent from the main navigation (I personally think this is a huge flaw). But when I search, I get a split tab right up top: Apple Music and My Music. The iTunes Store is gone because that's in a separate app, so my mental model doesn't even think about buying music here (and, cheers for that.)
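
Here's the search side of it, sketched under the same caveat (hypothetical names and data): the mode-scoped search that shipped versus the one-box search my mental model expected.

```typescript
// The modal search, sketched. Hypothetical names and data throughout.

type SearchMode = "apple-music" | "my-music"; // the iTunes Store lives in another app

interface SearchResult {
  kind: "album" | "song" | "playlist" | "artist" | "music-video" | "station";
  name: string;
}

// Stand-ins for Apple's separate indexes; nothing spans the wall.
const silos: Record<SearchMode, SearchResult[]> = {
  "apple-music": [{ kind: "station", name: "Time Crisis" }],
  "my-music": [{ kind: "song", name: "We're Gonna Make It" }],
};

// What shipped: search is scoped to whichever mode you're "in".
function searchSilo(mode: SearchMode, query: string): SearchResult[] {
  return silos[mode].filter((r) =>
    r.name.toLowerCase().includes(query.toLowerCase()),
  );
}

console.log(searchSilo("my-music", "time crisis")); // [] -- wrong silo, no results

// My mental model: one box that searches every silo.
function searchEverything(query: string): SearchResult[] {
  return (Object.keys(silos) as SearchMode[]).flatMap((m) => searchSilo(m, query));
}

console.log(searchEverything("time crisis")); // finds the show, no mode required
```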

How it Manifests

Here's an example of that modality and weird IA problem. One of the things I really enjoyed on Beats 1 was the show Time Crisis with Ezra Koenig, lead singer of Vampire Weekend. Let's say I want to find that show.

I head over to iTunes on the Mac and jump into the "online" side of the search box. I type in "time crisis". And I get... store results.

Not the Time Crisis I was searching for.

Remember, these are my topmost options within iTunes:

Your iTunes Store/Apple Music menu. Which is for which service? You just have to know.

So where would you go?

Right. I think, “Ah, okay. It was a radio thing. I'll go into Radio and search.” That way I can maybe tell iTunes that this is the mode I want to search in. Guess what? Nope:

I guess I'm not “in” Apple Music...?

Here is the wall between the store and Apple Music enforced, big time. Turns out, I have to go into one of the Apple Music options in that topmost nav: either “For You”, “New”, or “Connect.” In other words, that simple-looking navigation is indeed modal, as I thought... it just isn't the same model I was thinking of.

Once I'm “in” Apple Music by clicking “For You”, I can search Apple Music.

Note that the wall is back here. I can no longer search the iTunes Store!

Forcing a mode of working upon people is not an inherently bad decision. As noted above, Apple has chosen to structure this in a complex way. But there are ramifications to the modality that go unaddressed in the UI and UX, and we're worse off for it.

And this is why I won't choose Apple Music

This is a fucking mess. I have to do too much thinking in order to understand where my music is, what constitutes “my” music, and on top of that, I need to deal with the “oops! it's gone!” problems all streaming services have.

No streaming service is perfect. And I like a number of the radio shows on Beats 1. But, remember, they're free anyway; you don't need to subscribe to get them. Given that, I suspect I'm going to edge back to rdio in the next month or so and reconsider using streaming as a primary music method altogether.

Apple Music exemplifies the worst of Apple at this moment in time: it looks great, but from an overall IA and UX perspective, it's really, stupefyingly bad. I question whether Apple has the focus to see this product through, as it seems to be an also-ran that will tick the “Hey! We've got a streaming service!” checkbox and not much more (although I wrote about bigger things coming). A disappointing mess.

 

Tailoring

I turned on the light to my closet and walked in. It was 7:30 on the morning of a client meeting and, as usual, I needed a dress shirt. I don't have a wide range of dress shirts since I can get away with casual shirts at work, and so my selection is somewhat limited. But on looking at my options, I was disappointed. All of them fit well except in the arms, which, as usual, were too long. I chose a solid blue shirt, conservative but business-friendly, and went on my way.

---

It's no secret that I'm short even though I look tall on Twitter. 5'6" is my height but, honestly, that's when I'm wearing my Chippewa boots in the fall or winter. I'm 5'5 1/2" to be precise. And, due in part to my short stature, I also have relatively short arms and legs.

When I was younger, I didn't quite understand why my school uniform pants always needed to be hemmed. And as I progressed into men's sizing, I didn't quite get why a 30" inseam on pants meant that I was still dragging a few inches of denim on the floor. But there I was. Once I entered the business world, though, I knew I had to get my clothes tailored to some extent. That meant dress pants for sure (although there was a time when I tried to get away with cuffing – a bad idea) and that was also true for my lone suit and lone blazer.

---

The following week, another client meeting and another round of frustration with my dress shirts. I chose a white shirt, somewhat formal but not too too formal, and finished getting dressed. I had bought that shirt last year during a dress shirt replenish, but never took the time to get it tailored. I wore the shirt but felt really bad about it all day even though it was under a blazer – it circulated in my head as one of those things that you notice immediately but almost no one else would. (“Can they... tell? Do they think I look silly?”) I'm sure it manifested itself in my stance and confidence that day.

Not long after that, I found myself at Nordstrom Rack trying on new dress shirts. I found one that fit really, really well. It was a reasonable $35. I looked at my reflection in the dressing room mirror and nearly bought it. It was a good shirt.

But here's the thing: the shirt in my closet at home was also a good shirt. Its sleeves were just long and a little big, that's all. (If I ever took up weightlifting, and somehow only grew larger muscles in my arms, this shirt would have me covered.) But it was nothing a tailor couldn't fix. I left the store empty-handed, and called the tailor just down the street from work. $14 to shorten sleeves and a week turnaround. No worse than buying a new shirt online, say, and waiting for shipping. I dropped by the tailor and tried on the shirt.

“How short do you want the sleeves?” he asked. Ooooh, the voice in my head thought, he totally can't tell.

I pulled up the sleeves maybe 1/2" or so and said, “That's it. Not much.” He marked the sleeves. I paid my $14, changed back into my other shirt for the day, and was off.

---

All of this made me think about how willing I was to purchase a brand new, 100% fine shirt which, I might add, would likely have also needed some tailoring. I was almost willing to spend about $50 for a new shirt that day at the store, versus plunking down less than $20 to take care of the shirt I already had.

For quite some time I just thought that clothes were supposed to fit off the rack, and if there was something wrong with the sleeve length or what-have-you, too bad for you. My stance on this has changed. Yes, it totally sucks that I have to tack on $20 to just about any pair of pants I buy unless I happen to find one in a short length (which, thankfully, happens.) And no, I don't plan on getting my entire wardrobe tailored.

But there's something to be said here about the idea of reinvesting in what I already have versus throwing it out and getting something new. New is appealing. New is flashy. New is... new. It sounds good to have new stuff. Look! I got a new shirt! I got a new phone! I got a new pair of pants!

---

I picked up my shirt a week later at the tailor. I was genuinely happy as I got the plastic-wrapped shirt off of the rack and said thanks to the tailor. I got it home that evening, put it on and...

The sleeves were too short.

They looked ridiculous. It appeared as if my arms had grown out of my shirt, leaving no fabric behind. Worse, the arms were now so short that the shirt pulled across the chest and back. The tailor had ruined my perfectly good shirt. And this, unlike the sleeves being too long, was something others would absolutely notice.

In the end, I made the best decision: to stick with something I had, but update it to reflect how I actually was in that moment. Unfortunately, the too-short-sleeved shirt was the byproduct of a tailor who made a mistake. While I might go to him again, I'm wary of taking another shirt there. Pants, no problem. But now I know that this tailor isn't where I'll take my shirts. Lesson learned.

---

There is value in sticking with something that is familiar and seeing it through to whatever you need now. It can be tough sometimes. It may cost money. It may not be possible today. It may be possible tomorrow. It might hurt, because that shiny new thing is shiny and new. But it may, in the end, be the totally right decision... even if there are mistakes along the way.

---

This is based on a piece I wrote for The Weekly, my email newsletter.

UX is dead

UX is dead.

The industry is dying. Our practice is being watered down. It needs to be saved. We need to name it in order to save it. We need to be acquired, acquihired, or go it alone. We need to do these activities and stop these other activities. We need generalists. No, we need specialists. No, we need certification. We need to argue less. Well, no matter: UX is dead.

To be honest, I don’t believe this. I’ve considered it and read about it and re-read about it. I’ve wondered if it’s true. But it is not. The truth of the matter is that UX is in an exceptionally strong place.

Over the past few years, notable people have made bold declarations about UX. Peter Merholz, for example, said that there is no such thing as UX design. Upon reading his article and other similar pieces, I initially became defensive. They didn’t match my experience; UX was still needed, perhaps more than ever. I’ve worked with dozens of clients over the past few years that absolutely needed these techniques, philosophies, and practices. Without them, simply put, those clients will become irrelevant.

I’m reminded of another person who said UX would go away. His name is Paul McAleer, and he said that in an interview prior to the 2013 UX STRAT conference – oops. And you know, I still agree with this viewpoint. I do think that in the future, UX will go away as a practice. But that is a distinct position from declaring that it is dead or nonexistent.

This isn’t just a semantic argument. UX grew from a place that was closely aligned with digital products – interaction design, UI design, graphic design, and a subset of IA that is focused on navigation. Understandably, this led to confusion between UX and UI. This confusion is noteworthy because so long as we apply UX practices only to digital products, we run the risk of being deemed UX/UI or UI designers. It’s also noteworthy because we’re just now starting to collectively pull away from it.

This coupling is problematic. Technology in general, UI inclusive, moves at an incredibly fast pace. It has not slowed down during my 32 years in tech and I think that’s fabulous. But some of the best UX practices have drawn from more established, slower-paced areas. We want to stop and slow down and, say, do research… but the fast pace of technology demands we do research much faster. Thus, the practices of UX have always been in a fast/slow position when they’re fully aligned with digital products. This is an uncomfortable place to be, this grey area.

So when it comes down to it, I infer from Peter's piece that he’s had the good fortune to drive UX maturity to a point where the organization no longer needs “UX design”, and it’s fully integrated into the organization. It’s not dead, in that case, just superfluous as a separate practice.

---

Several years ago, my boss – a product portfolio lead – asked me about the future of UX and my centralized UX team. I laid out a prediction in line with my UX STRAT interview, and also suggested that my team would much later be dispersed to individual product teams. But UX was still young in that organization, and it wasn’t strong enough to support a distributed team just yet.

A month later, the entire product team was reorganized. My team was broken up and disbursed.

While I’d like to say that everyone was ready for this change, that was not the case. We ultimately did not have the organizational or financial support to succeed. In time, the UX team was re-centralized. Later still, it was moved out of the product organization altogether.

This concern over where UX lives, and how much influence UX has, often reveals an organization that does not consider design a core value and instead sees it as a requirement with no true authority – a checkbox on a list of deliverables. It is common, still, and traces back to the UX/UI coupling.

Similarly, I strongly disagree with the idea that agencies that provide UX services are out in the cold… or will be scooped up by a bank. There will always be a need for an outsider’s perspective, always. Organizations that grow past UX/UI are going to need a hand in figuring that out. Now isn’t the time for a milquetoast approach to design work. This is the time for agencies and companies to truly state their case, hire intelligent people, and line them up to do big work. It requires evolution from the client – internal or external – too: asking for “just an app” isn’t going to cut it in 2015. UX experts will need to assist clients and give them a boost on the UX maturity scale. It’s a change management problem.

I had a chat with a good friend recently who is about 7 years into her UX career. She’s been doing fantastic work, starting with user research, wireframes, and information architecture. But she recently started working on business problems and strategy – to her delight. A concern she brought forth was her place in the community: several events she had attended focused on perfecting skills for digital products… like user research, wireframes, and information architecture. She was concerned that she was outgrowing the industry. At some points, it seems like a lot of training and growth is geared towards new practitioners instead of those of us with a decade or more of work under our belts. But that’s the fast/slow contrast fully at play here. We ultimately have a degree of freedom in our careers and our work that we aren’t familiar or comfortable with because these growth and career directions are being defined just-in-time.

I get this; I’ve felt this too. I started on a path to be a programmer but shifted over to photography at my earliest chance. My art school experience was instrumental in developing my career, even though I didn’t realize it at the time, because it exposed me to different ideas, disciplines, and ways to define success. It sounds a lot like where UX is headed.

A few months ago my team had an offsite event focused on improving processes. One of the many thoughtful workshops centered on career definition. Upon reflection, it wasn’t hard for me to envision a time in the future when I’m not “in” UX. That doesn’t mean I won’t be using UX techniques, Post-Its, or whiteboarding the hell out of things; it simply means that my focus will shift to other areas. I suspect it’ll have something to do with baking, a long-budding passion of mine. No matter what it is, it’ll have something to do with making the world a better place.

---

This concept of our work being bigger than just digital has been a thread that’s been showing up in talks, events, and conferences over the past several years. Abby Covert's superb How to Make Sense of Any Mess is an IA book through and through but is geared towards a broad audience. The fourth edition of Information Architecture for the World Wide Web elevates the conversation above tools to focus on IA fundamentals that can apply to a wide range of problems. Both of these books encapsulate the spirit of what design can do without the trappings we’ve long felt as an industry, coupled to digital UIs. It’s empowering.

That sense of empowerment is something we all can and should build in ourselves. And so, when we’re working on UX – either formally or informally, in the field or not – it’s incredibly important for us to devote time to working on ourselves. If your interests are outside of the digital sphere, that’s actually a great thing! Learn from that! Work on that! Be present with that! Use your skills to see what tools and techniques you can apply in those spaces too. The best way we can infuse ourselves with that sense of possibility and wonder is to keep fresh on what’s happening elsewhere.

That’s where I am today. With the help of others, I started taking design principles and applying them to myself. This attitude and approach, where I was my own client and I needed help with everything from research to feedback, is something that helped me significantly over the past few years. I even started to write and speak about it. Taking these tools and techniques to other areas of my life didn’t diminish my enthusiasm for design. In fact, it supercharged it. I’m bullish on UX.

UX is absolutely not dead. UX is a field that ebbs and flows with new energy and new talent, echoing what came before. And it’s simply too big to be “just” for apps and websites. As high technology continues to push itself out of dedicated computers and phones, our skills and abilities will be needed in many more places, in many more industries.

But we can’t forget why we do this work. We’re curious. We’re intrigued. We can help others. We can inform others. We can ensure those without voices are heard and respected and understood. In the end, UX is all about people. And because of that, UX will live on for a very, very long time.

UX is alive.