Transcript from Erika Hall event
Read all of the questions and answers from "Getting User Experience and Service Design Research Right at Scale with Erika Hall" on April 13, 2021
Transcript shared by Michelle Cummings
Full recording on YouTube: https://www.youtube.com/watch?v=KJ3rf5QbUkg
We had the great privilege to host Erika Hall, author of the seminal design research bestseller "Just Enough Research" to discuss in-depth service design discovery and research methodologies, tools, and frameworks.
This event was a collaboration with our friends from the UX Research & Strategy group (hence the different video format with the small video in the corner).
Some of the topics we discussed:
Q: Research is a very fungible domain that can go very deep and very broad. The very first high-level question: we are all using the same tools and methodologies to do research, qual and quant, stakeholder interviews, etc. In your mind, what is the biggest difference between a service designer approaching a research plan and a UX designer approaching research?
A: That's a very good question. When I think about research there's no significant difference, because it all starts with the same process: you have to be clear on your goals, which is the number one most important step and something that I think a lot of teams and organizations kind of skip through. They skip to the methodology. They say 'oh, we're going to talk to users or customers, we're going to create a journey map' and they don't take the step to stop, pause, and ask: what are our goals? Generally: business goals, organization goals. How do we know if we're successful? Get really clear about that. Then once you're really clear about what success looks like to you, how you want to change the world for the better, then you ask: what do we need to know to do that? Where are the gaps in our knowledge? Where are our assumptions?
That's exactly the same no matter if you're doing UX design, service design, marketing, starting a business, or planning a vacation. It's the same process. I think what happens is people get into fights about methodologies because they haven't done the work up front to get that shared clarity, and that's where you get the arguments: what method is better? What documentation is better? It all depends on what you are trying to accomplish. And then once you have that clarity, and once you know where the gaps in your information are and what you need to know in order to make the decisions to be successful, then you choose your approach and your plan based on what you need to know. It's the same for any sort of applied research, I think.
Q: Absolutely. Second question: for researchers and designers it's often hard to get buy-in for the time and budget, or the perception of time and budget, to do research. We'd love to know: what have you found to be the most effective way to get that buy-in?
A: I'm really glad that you used the word perception there, because the first thing is to recognize that all of the objections based on schedule and budget are a complete fabrication. They're a total smoke screen. There is plenty of time and money. If there's time and money to do whatever project is at hand, if there's time and money to run a business, then there's time and money to become informed. I think those objections are raised because it's an easy way for somebody with power or influence in an organization to avoid doing research: they have a thing they want to do and they don't want pesky reality to interfere with it.
So once you recognize that time and budget is just an objection being thrown up, go back to your questions. A lot of times there's an idea that research needs to take a certain amount of time or money to be done well, and that's not true. If you know what you need to know and you know what you're going to do with the information, you can answer your questions in a process that can take varying amounts of time.
The general rule is: the bigger the decision, the more time or budget you're going to be investing in whatever course of action you're trying to inform, and that tells you how much time you should spend making sure you're confident. It also depends on what you know and what you don't know. That's why the first part of the process I talked about is so important. You might get everybody together around a table and ask what you actually need to know, and you might find out that you know a lot already, or you might find out that you have some pretty big assumptions that are just made-up wishful thinking. You won't even know how much time or budget learning is going to take until you do that.
Q: What do you suggest teams do when they're told they can't or shouldn't do research?
A: Redefine research to be something that has been permitted explicitly.
Q: Any tips for how to do that, to reframe that?
A: The most important research that you need to do is internal. You need to understand the decision makers, the people who are controlling that aspect of your reality, because if you understand those people well, then you can explain to them how your work makes them more successful.
One thing to let go of: don't try to convince anybody to value research. Who cares? Who cares what you call it? You want to make better, more informed decisions. You want to increase the chance of success. You want to decrease risk. So talk in those terms, because research to a lot of people sounds like wasted time, wasted money, time not doing work, which means not manufacturing visible artifacts. You don't need to get somebody to care about research. You just need to find a way to do the things you need to do to make whatever you're doing successful.
Do research on your stakeholders to understand what makes them tick, and then reframe your needs in that way. That's something a lot of teams just don't do. I have a whole line of consulting business where people come to me and ask, how do we get stakeholders to buy into research? And I say: well, do you know what their basis of decision making is? Do you have an explicit framework for decision making in your organization? And they say no.
There's this idea that there's a right way to do things that involves a certain set of artifacts or methods and that gets really dogmatic. The worst thing is a dogmatic researcher who's like oh we only do things in this one way, as opposed to always asking the question: what do we need to know and what's the best way to find out?
Because sometimes you can learn what you need to learn in an hour. If you have a good question, you can go out on the internet and do what we all do in our daily lives. This is the thing that's so wild to me. Just to live life right now in the 21st century, we are constantly researching. Google, Alphabet, is worth a tremendous amount of money because people are always forming queries and doing research, but we don't call it that. You would never order dinner, you wouldn't order takeout, without doing research. But you don't think about it because it's just natural. Would you just put your hand in a door, they hand you a bag, and that's dinner? You'd say no, I want to read reviews, I want to talk to people, I'm investing $50 in takeout. But ah, we're doing a million-dollar project and it's 'oh, do we have to do research?' That's ridiculous.
Lauren: I love that analogy. I'm gonna use that with my partners next time they say we can't do research and like you research everything. That is great.
Erika: All the time, all the time. But all of a sudden you're like, ah… The thing people don't want to admit is that what they really want to do is be proven right at work. They want to have the authority, they want to have the truth, they don't have the answers, they want to have that special thing people call intuition. That's the word you hear all the time: 'I have my intuition.' Intuition is a combination of experience, which is a form of research (you've experienced these things, you've seen these things), and confirmation bias. You want certain things to be true and you're bummed when they aren't true. You're bummed if you wanted to build a feature or you wanted to design something a certain way, and you talk to the customers and find out they want something super boring. This is a thing we don't talk about enough, maybe in service design more, but not as much in UX design: a lot of what really meets people's needs, really truly, is super boring to design. I don't think it's boring, because I think it's really cool to do things that make life easier for people. But a lot of times the best design solution is to have writers. If organizations hired way more writers than engineers or visual designers, I think they could actually improve their software tremendously. But that sounds boring. That's not technology. Writing is a six-thousand-year-old technology or something like that. That doesn't count as design. And so people want to do the new cool thing: 'oh, I want to have a lamp that talks to me, cool.' That's not what people want. I think at the heart of all these objections to research is that if you really set aside your biases and you really ask questions like what do we need, the people doing the work don't want to hear those answers.
Greg: Yeah, so true. Great insight there. So, Erika, we've all been part of projects where, as we discussed, there is no research, no value for research, a little bit of research, or sometimes too much research, and then we have this synthesis paralysis, if you will. So I want to know from you: when is it enough? When do you actually reach critical mass of research? When do we have enough research to turn into insights that can be turned into ideation, prototyping, something to build, to convince people, to get extra funding? There are a lot of people at play here, right? As you just mentioned, some people get kind of antsy: nothing has been happening for the past six weeks, I want to see something. So there is a need for the business to see stuff. When do you know you have enough to start building stuff? What is the threshold?
Erika: There are a couple of answers. First, reset the whole organization's expectations: you're never done researching. If you went to a software company and asked, okay, when are you done coding? When can the engineers just stop? The answer would be never, because you're continuously improving. So I'd really like organizations to reframe research as continuous learning. If you're making stuff, if you're producing or delivering, if you're making decisions, learning accompanies that at every stage. It's the same thing.
We're always learning. You don't want to stop learning, because the second you stop learning, that's when you get all the risk; you're not looking at the world. So you should have the mindset that of course we're always going to be learning. Take the word research out of it, just ditch that word, and say, 'okay, what do we need to know to make this decision?'
There's a fantastic quote that I pull out all the time from the Wall Street Journal which is very comforting to business people. It's from an article called The Power Of Thick Data and the quote is “All business is placing bets on human behavior.”
So a good way to figure out what's enough is to ask the question: how big of a bet are we placing? If you're placing a really big bet, you should probably be better informed. That information can come from a lot of places. Maybe you don't need to do as much formal research if you're working with a team with a tremendous amount of experience solving this problem, who has genuinely worked with this audience before. Or if you have big questions, and you get together and ask those big questions, you need to do more work to find the answers. But again, that kind of work depends on what you actually need to know and whether somebody else has already asked the question.
What I really recommend teams do: everybody wants to brainstorm ideas, and I think brainstorming ideas is hot garbage, because what that does is put people in competition with each other to prove that they're right or prove that they're smarter, which is already anti-collaborative. Instead, get your team together and brainstorm questions.
What I want everybody to start doing: get in a room, virtually or otherwise, and ask, okay, where are our greatest unknowns? Get them all out there, and then rank them: what's the actual thing we just don't know, and what are the highest priority questions? What's the biggest question that we have the least information about, where if our assumptions are wrong, that's going to tank us? That's the one you answer.
In addition to never being done with research, you have to let go of being certain. A lot of times that's the question behind it: when are we going to be certain that we know everything? You're not. You're going to be confident. It's exactly like the analogy about ordering dinner or planning a vacation. There's a point where you have sufficient confidence. Where you're like, okay, everybody in my flat has agreed we want Thai food, we're really hungry, so it's more important that we get the food fast than that we get the very best Thai restaurant in the city. There. Boom. You've identified your needs. You look on the delivery website and you say, 'oh, this Thai place is the one that has four stars and delivers in 30 minutes.' There you have enough confidence to make that decision. But if you're picking a caterer for your wedding, which is the highest-stakes food specification I can think of, you're not just going to sit down and say, 'okay, who can feed 50 people on Tuesday?' That's not how that decision gets made. So you do more research. You look at the caterers, you read reviews, you go out for tastings. It's exactly the same in an organization.
But everybody's brains are broken because they're so afraid of all the other people they work with. It's not hard. The thing that's hard is getting that collaborative environment in place, so people don't feel punished if they need a little extra time or if they admit they don't know something (which is step zero for learning: admitting you don't know). If you're not in a safe environment, then all your brain power is constantly spent worrying about not looking stupid in front of your colleagues. Once you get rid of that and you have a collaborative environment where people are clear on the goals, then it's cool. You're supporting each other and you're not freaked out that you have to prove that you know more than somebody else.
Greg: Do you have a tip or trick, because confidence is very subjective, obviously? If you have five people in the same room, there are five different opinions, five different levels of confidence. So do you have a tip for getting alignment on confidence? Also, is there such a thing as too much research? Too much confidence? When do you know you've crossed that threshold and it's like, okay, now we know too much, we need to pare it down, because we actually answered six questions instead of two?
Erika: I don't think you can ever know too much. I guess the way to think about that is: if you're not focused, the most important thing is just being clear on your goal. It's all conversations. So much of the important work is just talking to people, and that's very scary for people, because everybody wants an artifact that they can point to. They want 'oh, look at all the code.' Look at all the stuff we've made. Look at the diagram I've made.
So much of this is: you get together with other people and you say, okay, we're going to make this decision. We've agreed this is the goal, this is our recommended course of action, we're basing it on this that we've learned, here are some remaining questions. How does everybody feel? Does anybody have any objections? Can anybody find any weaknesses in our course of action? If you have an open, candid discussion like that, it can be really fast. But one of the other problems with asking questions is that people are really afraid of questioning a course of action, and that's why bad things happen.
First you need a room of people that genuinely support each other, that genuinely see shared success. When you have that then somebody can raise their hand and say ‘oh what about this? This seems like it might be a risk.’ And then the other people will consider that and say ‘oh gosh I think you're right. How do we reduce that risk?’ And you just go from there. There's no magic answer. It's critical thinking.
It's getting to a place not really of consensus, because consensus is 'we're all agreeing and nobody's arguing.' Collaboration is where you have clear standards of evidence and clear goals and you can argue, but you know that at a point we have to make a decision.
We've worked with a lot of clients in media and journalism. Journalists are the best clients because they know how to do this. If you've ever seen a front page meeting for a paper (for the papers that still print and have front pages), they spend an hour with all the editors from all the different departments. First they'll review the previous day's front page, which is a representation of what that organization thinks is the most important thing going on in the world (the shortest retrospective): How do we feel about yesterday's decisions? Were we right about our prioritization? Could we have done something differently? Okay. Great. Moving on.
They have one hour to decide what's on the next day's front page. Everybody makes their case. They argue: maybe someone says 'the story about politics is huge, it should be this big,' and somebody else says 'well, something else has happened; say it's the pandemic and we have a Covid discovery and we feel that's really important.' They have a clear, shared set of goals as well as principles, and they all feel: we want tomorrow's front page to represent our best collective judgment.
They show that they can fight, fight, fight, and then they say: okay, the hour is up, we have to decide. Tomorrow's headline is going to be about this, it's going to be this big, this many column inches for this story, boom, boom, boom. They ship it. That's every day the paper comes out. It's exactly the same process in every other kind of organization, or it should be. You have a deadline, you know what your deadline is, you know what your goal is. If you know what your standards are, if you know what quality is to you, and if you know what the relative priorities are, you can make those decisions, and you know that sometimes you're going to be wrong. But your goal is always to manage the risk. That's just what it is to do business.
Greg: That's a great answer. Thank you so much Erika.
Lauren: I have one follow-up question, then I'll get to the next one. Something that comes up a lot when designers and researchers are dealing with business analysts, or people that are very data-centric: how do you manage reducing uncertainty to make a decision when the qualitative data isn't statistically significant? How do you overcome that?
Erika: Step one is, of course, you have to get that collaborative environment. You have to get to know each other. Interview those people, whatever the process is, so everybody feels that shared success. That has to happen first. Then, once you have that, the way to think about qualitative and quantitative data is you need both.
But you have to figure it out in advance, going back to the goal and the question. The thing everybody wants is speed. If you have a qualitative question, you can't answer it with numbers. If your question is: how do people decide what to have for dinner during Covid times? If that's your question, because you're trying to help out with that, you can't math your way to an answer. What you can do is learn about what people are doing through a variety of different methods. Then if you're like, 'oh, we think people are doing this,' okay, well then what's our quantitative question? Do we need to know how much money people are spending? Do we need to know what portion of Americans are using Doordash versus going to the grocery store?
There's this bias that quantitative information is somehow better than qualitative, because again, everybody's afraid of looking stupid in front of their colleagues. Numbers are false certainty. A number is just a measurement. If you go into the real world, again outside business, where people get all wacky, we know this. If somebody was planning a vacation and asked their friend, how was the place you stayed? Tell me about it. And the person replied, '$300 a night.' What do you do with that? Do you know if it's good for kids? Is it romantic?
You have descriptive, qualitative questions. You always get into this thing of 'prove to me the value of qualitative data.' You just have to reject that, because that's a garbage framing. What you have to do is go back and say: what do we need to know? What do we actually need to know? What will it take for us to be confident? The most important thing is to have that collaborative basis. We need to know what's going on in the world before we can measure it.
Qualitative knowledge has to come first, or you're going to be using the wrong instrument to measure things. You're going to be using a ruler to check the temperature, which is a lot of what people do, or you're going to be using garbage survey data and saying, 'well, it's numbers, so it's more important.' Numbers don't have any context.
Greg: Erika, I have a follow-up to the follow-up question before I move on to the next question. I have a very particular example. We had Mark Stickdorn on the show last year, and he mentioned a very large German telecom company that had a very thorough, very disciplined approach to what they were looking for. They had questions, they went to market, did the research, and they were very disciplined about it. Then they got to a place where they realized they had so much data to analyze that it was too much. They literally went back to the business and said, no, it's going to take us a year and a half or two years to go through all that stuff and come back to you with recommendations. So they were doing such a great job at research, so to speak, that they got to a place where there was too much research. And then it was, what am I supposed to be doing now? Business wanted results, and they were told we can't give you results for another year, which was unacceptable. So what would you have done, and what would you recommend in that example?
Erika: Great example. I call that situation a lot of data and no understanding. They weren't good at research, because you can't be good at research unless you know what your question is. All they did was go out and hoover up data. Data doesn't have any value on its own. Having more doesn't mean you did anything better. That's a huge misconception and a huge assumption. It goes back to people's own insecurity and how organizations set up these toxic environments for decision making. If you have a good question, your goal should be to answer that question as quickly and confidently as possible. But you have to reframe the whole thing. All they've done is create more work for themselves. They didn't solve their business problem. They optimized for gathering a lot of data without knowing why. You have to know what you need to know beforehand, or else you're just going to create a huge problem for yourself.
I've actually worked with clients on this, and it's super fun. It's a super fun kind of work for me. I was working with a much smaller organization in that situation. They said: okay, we've got 30 marketing analytics consultants, no joke. We have all this data. We've been fighting for a year with our agency of record about the direction we should take this design project. We keep fighting, but we have all this data and we don't know what to do. They had all these numbers. They were fighting about what the numbers meant. And no joke, we went in there, and within three weeks of getting everybody in a room, figuring out what decisions they wanted to make, figuring out what they needed to know, and doing the first half dozen interviews, all of a sudden they realized how wrong they'd been, that they'd just been asking the wrong questions, and we fixed their situation.
It takes me six weeks to go into an organization and fix exactly the problem you described. For any organization, it takes six weeks: one week of how did you get into this screwed-up situation, one week of interviewing all the stakeholders so I understand what their pathologies are, then one week of figuring out what you need to know now given what you've learned (the situation you're in and your goals), and then two weeks of usually some qualitative work to fill in the gaps. Then I'm like, go with God, and their shit is fixed. It doesn't take that long. But it takes not being afraid of anybody in the organization. I'm not smarter than any of my clients. I'm just not afraid of the same people they're afraid of. I don't care about getting fired. I don't care about them thinking I'm smart. Everybody is so concerned with looking smart that it makes them stupid.
Greg: Interesting. All right. That's the perfect answer right there. Can we just stop the meeting now?
Erika: I'm very efficient. Very.
Lauren: It sounds like it, and you don't care. What methods do you like to use when you're really tight on time, budget, or other resources?
Erika: It's the same one. Get everybody in a room and figure out what you don't know, which takes an hour. Do that question prioritization, because that's the step everybody wants to skip past: what do we actually need to know? The greater clarity you have about your question, the better. And then: when do we need to know it by? If your question is, again I'll use the same one, how do people figure out what to have for dinner? Okay, we've got a week. Great. Start from what you need to know and how much time and money you have, and you can learn something.
There's a lot of stuff on the internet, and there's talking to people, once you know the right people to talk to. It takes a very small number of upfront conversations to save so much time. But then researchers fall into the same trap that I was talking about with designers and business people: they feel like, 'oh, I don't add value unless I go through a really formal process.' But if the point is learning, what if you're able to learn something super fast, without using a fancy tool?
Take Indi. I love Indi. She's a good friend of mine. I never see her because she lives across the bridge in Marin, so we've spent like four years talking about getting together, but we only talk on the phone.
Indi talks to people on voice calls. That's her problem-space research, and she can learn so much. You can learn so much if you talk to the right people and you know what you need to know. That's the step you can't skip. But it feels so scary for people. People talk about Jobs To Be Done. Who cares? If Jobs To Be Done works for you in your context, awesome. If it's the wrong thing, but it's what everybody's decided is the safe tool to use because nobody will yell at you for it, then that's bad. So you just have to get in a room and talk to each other and say: okay, what do we actually need to know, by when, and how confident do we need to be?
Yeah, it's terrifying, because you can't hide behind a methodology or a document. You have to admit you don't know something, which is the hardest part. But it's the question. It all goes back to figuring out a good question, which you can do fast if you're not trying to solve for all these other goals: don't get fired, don't get yelled at, have the quant people not dismiss me.
If you really are honestly in a place where you're like we want to learn something you can learn things really fast like really fast.
Lauren: Yeah, it sounds a lot like getting back to basics, really focusing yourself, and getting it all out there. I noticed a question in the chat earlier, when you were talking about vulnerability and admitting what you don't know: how do you help people overcome that fear of feeling or looking stupid?
Erika: If you're a person in leadership, the most important thing you can do is model that for people and admit that you don't know things, confidently and enthusiastically. It's an opportunity to learn. Anybody with any sort of standing in an organization should, as much as possible, just do that. Just be like, 'Huh, I don't know.' Try to catch yourself making up an answer, because this is something you'll notice once you start looking for it: somebody asks you something and you realize you're speculating. That's a good practice.
A long, long time ago we were working with some folks from the Wall Street Journal, who put on a conference in 2007. Bill Gates and Steve Jobs were both speaking, and there were all these other businessmen; it was a big-deal business conference. There were a lot of heads of very large enterprises, and they would get on stage in these interviews, and when they got questions about the future of the market or business or whatever, they would speculate, speculate, speculate. Then Steve Jobs got on stage for his interview. This was the only time I saw him speak in person. Walt Mossberg asked him to speculate on the future, like, what do you think the future of streaming media is, something related to the Apple TV. And Steve Jobs paused and said, "You know, I don't know. But I'm really excited to find out." Seeing that, after seeing all these other people at the head of Viacom going blah blah blah, numbers numbers numbers, projections projections. To see Steve Jobs just admit: I don't know. Why would I know that? Nobody knows that. But isn't it exciting that we're creating the future, as designers, as technologists? Isn't that cool? Man, he made everybody else look like an idiot. It's that.
But everybody rushes to speculate. We see this on the news, on cable news: if there's a disaster and there's no information, people speculate. So: stop speculating! Stop rewarding that in your organization, on your team. That's how you create the culture, one conversation at a time. You just say, okay, how do we know this? What's the basis of the information? So you can start to separate out: this is just something I hope is true, versus, I have evidence for that. And don't bust people in a hostile way. Just create that culture of always, always questioning. Is this the right thing to do? Is this just an assumption? Is this really a pattern we're seeing in the data, or is that just what I'm hoping we're seeing because that would be mildly convenient, or because we've already created this whole product or service along those assumptions?
It's just one conversation at a time, reorienting people around always asking questions, making decisions, and making your decision process really explicit. It's building that practice and getting away from 'oh, this is the right methodology' or 'oh, this is the answer and we're defending our answer.'
Lauren: Yeah. Love that. It's all about the culture and how it's modeled. Greg go ahead.
Greg: So Erika, what are some clear ways? You just told us a lot of them, so it's basically about collaboration and building confidence. But can we dig a bit deeper? Give us some more creative ways to share research findings so the insights actually bubble up internally and you can actually get traction, so we can just move on.
Erika: There's a process that's very common, where researchers go out and get a lot of data and then try to make their case to some decision makers. That process is doomed. So much research gets done, very good research, a lot of data is collected, and then it just gets ignored. The biggest problem I want to solve is getting people to not ignore things. The best way to do that is to involve those people. Whoever you want to act in a certain way, have that conversation with them at the beginning: what do we need to know? Get them invested in asking the question and participating in that process, which, again, is often a conversation you can have in an hour or 90 minutes.
It also depends on how much you know. There's technical debt and design debt. I think there's also critical thinking debt, which is huge in a lot of organizations, where it's like 'ah, we've never actually been honest with each other about what we know, and so now things are going to get ugly for a little while before they get better.' But if you start by saying, okay, we want to learn this, and we want to learn it for this reason, because it fits in with this goal, what do we need to know? And you actually involve them up front in prioritizing the questions: what do you think is most important for us to learn about? What do we already know? Where are the gaps? If you get everybody who you want to use the information at the end of your research project involved at the very beginning, they're invested, right? They've got context. Then keep them involved along the way. Have them participate as much as possible, whether it's just listening in or talking about recruiting. Make them participants. There are ways to do this if you design the process well with this in mind, ways where people can kind of touch in and out.
Everybody's got this obsession about research repositories which I think is hot garbage too. The world is your research repository. If you have a good question you can dip in and out and not just go back to ‘oh what was that study that we got permission to do two years ago that's like the only thing we could learn.’
If you design your process to be participatory and conversational, you have this shared understanding of 'oh, we're all just learning; we all know what we need to know and what our goals are.' I keep saying that over and over again because it's so rare, and it makes everything easier. The best way to share findings so the insights are internalized is to make sure nothing is a surprise at the end. It's not like, ah, new information! Everybody has this bad idea that research results in an 'aha' with new information. We already know 99% of what we need to design things really well. The question is, why do we still design things badly? It's because people just want to make the things they want to make, without reference to reality. So it's actually about retooling the process, because if you just give somebody a report and they have not already bought into the process, there's nothing you can put in that report that will make them care.
Greg: So Erika, we know these disciplines are very siloed already; we see this every day, and that's part of why this group exists, to break the silos, right? But within the business, design is siloed, and within design, research is even more siloed, sometimes in a different building, a different city, a whole different place. So how do you recommend actually getting that engagement? How do you get people engaged and involved in the process when they may be in a different country?
Erika: That's a really good question. I had some researchers in a workshop I did once
who said they were in a different building that had a different air conditioning system. They had more oxygenated air, because somebody thought that researchers needed to breathe better or something. It's so weird. That's a big problem. I think the researchers in an organization should make this their most important research project: understanding the organizational context in which decisions are made, and reprogramming people kind of one at a time. There's no shortcut for that. There's no magic shortcut.
The other thing, to retain your sanity, is to recognize that you can't change the whole organization at once. Say: we're in this situation, we need more shared understanding. That's kind of your research question: how do we change our organization to have better critical thinking and make better decisions? Then just make that a project. It's going to be different in every organization. So the most important thing is to do an analysis. Ask: what are the incentives? Because people only do things because of habits and incentives. What are the incentives for not using research? Look at that. I think that can be surprising and shocking once you turn that question inward.
I met this group of consultants, and one of the projects they worked on was going into organizations and finding projects to kill, right? This is the fear. There was an executive, they got a budget, they developed a line of business all out of their head, and it's chugging along. If you really asked: is this doing anything for us? Does anybody need this? The answer would be no. But you don't want to take away somebody's power and budget. So they don't want research. They don't want anybody asking questions of their customers and finding out 'oh, you're actually doing this totally useless thing that somebody else is doing better.' So step one is understanding all of those incentives for continuing to work in ignorance. Who loses if you learn things about reality? That's a good question to ask in your organization.
Greg: Yeah, it's a fantastic question, thank you so much. I think we have one more hard question, so to speak. Lauren, go ahead and ask the last question, and then we'll see if we can take some of the audience questions. Erika, are you okay going a bit long? Because we have a lot of questions. Okay, all right, so this is our last prepared question, and then we'll take some from the audience.
Lauren: What do you recommend doing when all the existing research was conducted by different people, maybe a previous research team, or there was a reorg or something, and you really don't have a lot of insight into the methods used or how the insights were formed, but you're encouraged to build from the past and not recreate the insights? What would you recommend there?
Erika: If 'you're encouraged to use it' means 'oh, we're not going to allow you to continue learning,' I would say unpack that. Just because you have existing research doesn't mean it's useful; existing research is like content, and people treat it like it's inherently valuable. We would get into this situation when we worked with clients on design projects and they'd say 'oh, we had some people who did some research last year, we already learned that.' And I'd say, 'well, that's great, but we have different questions.' You have to shift the value from the answers to the questions, because answers have a really short shelf life. What you learned a year ago might not map to today's reality at all. But if you have fresh questions, you can ask them, and then ask them again later, and you could probably learn fresh things really fast. There's always this sunk cost fallacy, like 'oh, we paid this consultant $200k, so what they did must be worth something.' Have the courage to call stuff garbage and useless when it's garbage and useless. But of course, understand your stakeholders well enough to know how to frame what you're saying in a way that makes them feel awesome.
Lauren: Right, yeah. There was a similar question from the audience, from Simon: how do you know when biased research with questionable methods is being used? How do you spot that bias and know when you can say, hey, this is garbage research, we actually need to do something different?
Erika: It's like anything that was done in the past. People get so weirdly attached to that. If you weren't a part of it assume it's garbage. What if it is garbage? Go from there. What do you need to know? It all goes back to what do you actually need to know. Are you trying to save the results of something somebody did in the past? This all comes because there's this idea that learning has to be expensive and time consuming, that you have to try to save something. But if you're reading through a report it's the same way you look at anything in science. It's sort of like you're doing your peer review and you're like okay what was the question? Does this answer the question? Why should I have confidence in this or not? Cultivate an attitude of skepticism, assume that everything is super biased and then look for evidence it's not. Start from the assumption that it's not trustworthy. That's a good exercise to try.
Greg: I have a little follow-up question on that. We've all been in this situation before: you go into a company (I'm a consultant, like you are), and they did personas, for example, years ago, and then somebody else came in and did new personas. And as you said, it's about confidence: we don't know who did them, we don't know why it was done, and we have too many of them. We literally have 85 personas, true story. It's kind of the same question, right? When you get to a place where there's inherited research and no one really knows how it happened or who did it, and you have no confidence in it, how do you go in and either decide, you know what, let's just throw everything away and start from scratch, or decide collectively what to keep and what to throw away?
Erika: Uh, yeah, that's a good question. When I actually go in-house and work with people, I always recommend starting with 'this is awesome.' That's where you start. You're like, 'Cool, you have so much stuff. That's awesome.' It's kind of the 'Nailed It!' of research (if you've ever seen that Netflix baking show). Then you say, okay, what do we need to know? That's the conversation you have to have. It's the same 'what do we think we know? what do we need to know?' conversation. And maybe in the course of that conversation they say 'oh, you know what, we learned this, and here's our evidence for it,' and you go 'cool, maybe we feel like we do kind of know that.' But you might say 'okay, well, we knew that five years ago; do we know it today? Maybe we need to re-ask the question today.' So it's that same conversation of what do we need to know. Always start with the now and the future.
It's the same thing if I'm doing content design stuff. It's like, okay, what do we need to communicate? You start forward-looking, and then you say 'okay, do we have anything that answers that?', because otherwise you're going to try to save and justify. So go in with the assumption that none of it's useful and say, 'okay, but what do we need to know? Let's pretend this doesn't exist.' Start by forming the questions, and then you say 'okay, does any of this that we created answer those questions?' Maybe it does, but you have to start with what you need to know, not with what stuff you already have, sorting through it like it's Antiques Roadshow: is this horsey worth anything? That's how people treat their research reports.
Lauren: I'm dying at your analogies, and I can't wait to take all of them back to work with me later this afternoon. You mentioned earlier in our conversation that getting great writers would solve a lot of the problems we're trying to solve here, and Sama had a question: do you know any examples of organizations that are great because of their writing, along with their designers and developers? People are just looking for that example.
Erika: Apple. Going back to Apple. Why is Apple so awesome? Look at how they use language.
Look at how they value language. They're a big one. Look at their messaging; it's a few words.
Look for the people who use very few words very well. There's this idea of more. More is more. But when it comes to clear expression people who are really concise & succinct communicate super well.
Let's see who else is really... I'm trying... who else is really easy to deal with and good at it. I'm always terrible at coming up with examples of things like that on the fly so I'm gonna look at my phone and be like who doesn't suck. I like Mailchimp. Mailchimp is pretty good. They've really cared about it. In banking Capital One is pretty good. Anybody who uses language really clearly and doesn't seem to have a lot of it is pretty good.
Lauren: yeah I love that yeah.
Erika: If you look at interfaces, they're mostly words, right? The graphic design is just there to put the words in the right place. But everybody starts by drawing boxes. The words are the important part. That's the meaning. That's a different book, though, yep.
Lauren: Absolutely. There's a question from Michelle from earlier: if you are prevented from talking directly to users and customers, what kind of secondary research do you suggest? Or do you even suggest doing that, or do you push to get to the primary users?
Erika: Yeah, two things. One: why are you prevented from talking to customers? That's ridiculous. I know a lot of times there are constraints; I've worked with a lot of financial organizations. But you can always talk to people who are like your customers, and this is why recruiting is so important. If you're doing user, customer, or audience research, you need to be really careful who you talk to and get good, really well-selected people. There are so many platforms now that promise to deliver you research participants. I would say no, don't use those. They're terrible, because the quality of your insight totally depends on whether you had participants who were truly representative of the right people.
So in a situation where, for whatever reason, whether it's regulatory or whatever, you can't talk to actual existing customers or users, you can talk to people who represent those people. You can talk to people who are proxies. If you design your study right, you can always find those people. And again, 'oh, you're not allowed to talk to people' is usually code for 'I would prefer not to learn things about how people are in the world.' But it all comes back to what do you need to know. You can look at analogous research too, because people tend to get really specific about their research questions way too fast. It's like, do you want to know how people use certain devices in their day? Do you want to know what's easy or hard? We know so much about what people want and need already. The knowledge is really out there, but people are just finding ways to justify doing things that aren't good because they sound fun to make.
Greg: Erika, I want to be respectful of your time and everybody's time here; we're stretching a bit over. Everyone here has been reading your book, so: what is Erika reading right now?
Erika: Oh God, what am I reading? Hmm, let's see these 35 books. Yeah, I haven't read all of these 35 books. One that I really like, because I've been reading a lot about architecture: I think our field of digital interactive design is relatively new, and we're struggling with a lot of things around practices and ethics. Architecture is not perfect by any means, but I think there's better-developed thinking and better-developed scholarship there. This is a book I like; it's a little too neoliberal; it's called Order Without Design: How Markets Shape Cities. I think it's a really good one for service designers or UX designers to read, because it's about how we're participating in systems, and the systems that create the built environment are really inflexible relative to the other things we're doing. But there's still this idea of the individual designer; so much of our practice is based on this idea of 'oh, I'm dictating or controlling things' as opposed to 'oh, I'm trying to move the way forces go in the world. I'm trying to influence really complex systems.' That's the reframing of design I like to see, and a lot of it has to do with value and economics. Designers don't like to think about business. So I'm reading that.
God, what else am I actually reading that's super good? David Graeber, who is fantastic and sadly passed away last year. The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy. That's really good. I'm so sad that I only really got into his work after he passed away, because I feel like his perspective on the world, the way he thinks about value and incentives and so much of the social fiction that we take as reality, shapes how we operate in the world. Very cool.
Greg: So Erika we have probably like 30 awesome questions from the audience we will not be able to address those so Lauren how do you want to handle this?
Lauren: Well, first of all, Erika, thank you so much for spilling your wisdom onto all of us. There has been amazing positive feedback in the Slack channel, and I think we have a 'What Would Erika Do' thread going, so thank you so much. Yes, as Greg mentioned, there are a bunch of questions we didn't cover, but we do have a UX Research and Strategy Slack. Jen has posted the link to sign up a bunch of times, so if everybody wants to join, we can continue the conversation there and post some of the questions to discuss within the group; there are awesome researchers here who can help answer those questions or just have that conversation. You're obviously welcome to join our Slack channel as well, Erika. But this has just been so awesome and inspirational, thank you very much, and please, everybody, fill out the follow-up survey. Thank you so much, Erika. It feels like we just scratched the surface; we could do this every week, and maybe we'll do another one in a couple of months. But thank you so much for your help, and thanks so much for being so blunt and candid. As you mentioned, research can be very academic and very difficult to access and approach, so I really appreciate the fact that you're just cutting through it and telling us how it is. That was really refreshing. I appreciate it a lot.
Erika: Super. I just want people to spend their time doing things that are more effective, and not spend their time doing things that aren't. I know there are plenty of other people out there who have the academic side covered, so that's sort of my role. I'm on Twitter too much, so if you have other questions you can find me there; I'm @mulegirl. Feel free, or email me, that's cool. So yeah, this was super fun. I love answering these questions because the problems are the same all over the world. It's just hard work to chip away at this, because so much of it is these objections and this fear of learning.
I guess the closing thought is people are really afraid and that's where so much of this comes from. People want certainty. And the idea that you're never done learning and that you can't just be done, that's what drives so much of this. It is this fear and a lot of the objections and the roadblocks that you encounter are really people trying to protect themselves. Once you realize that then you can stop trying to prove people wrong, which will never be effective. You can always think ‘oh how do I help create an environment where everybody feels supported by each other on this uncomfortable journey together?’ It's always going to be uncomfortable. But it can be fun and exciting and rewarding. But you've got to recognize that the objections come from fear.
Greg: Perfect. Thank you so much, and thank you as well to everybody who joined us today. We have over a thousand people, from Australia to Russia to Portugal to Canada. Thank you so much for joining us. The recording will be available, most likely tonight or tomorrow, on both channels. We really appreciated having you; it was good to see everybody, and we hope to see you again next time.
Lauren: Yes, thank you everybody. Have a great rest of your morning, evening, afternoon, whatever it is wherever you are. Thanks, Erika, thank you. Good to see all of you, thank you, thank you.
Michelle Cummings is a junior user researcher. Her holistic approach captures content that helps us make informed decisions. Special thanks to her from UXRS for transcribing this event for us.
UX Research and Strategy
We are here to serve the UX community: keep you in the know, connected to each other and always learning.