Inside Tech Comm with Zohra Mutabanna

S4E2 Content Research for Better UX with Erica Jorgensen

April 21, 2023 Zohra Mutabanna Season 4 Episode 2

In my latest episode with Erica, we dive into all things related to content research. If you are intrigued, then you are at the right place. We touch upon these questions for a deeper insight:

  • What is content research?
  • What types of content lend well to content research?
  • How can you use content research to gain quick insights without costly A/B experiments?
  • Can ChatGPT or similar tools influence content research? How can we leverage these AI tools?
  • "Simple language is powerful, but not always easy." Why is that the case?

Guest Bio

Erica Jorgensen is a staff content designer at Chewy.com and the author of Strategic Content Design: Tools and Research Techniques for Better UX, published in April 2023 by Rosenfeld Media. She's a content designer, content strategist, and team leader determined to bring greater respect to the content field. To that end, Erica speaks frequently at conferences including UXDX USA, UX Lisbon, Microsoft Design Week, the Web Directions Summit, and Button: The Content Design Conference, and on podcasts like The Content Strategy Podcast with Kristina Halvorson and Content Insights podcast with Larry Swanson. In addition to working in content roles for companies of all sizes, she has taught at the University of Washington and Seattle’s School of Visual Concepts. Erica earned her B.A. from the University of Connecticut and M.A. from the University of Missouri’s School of Journalism. In her free time, you can find her exploring Washington State’s wineries or hiking with her husband and rescue dog, Rufus.

Credits

  • Intro and outro music - Az
  • Audio engineer - RJ Basilio
Zohra Mubeena-Mutabanna:

Hello, listeners. Welcome to Inside Tech Comm with your host, Zohra Mutabanna. In season four, I hope to bring you different perspectives and interests that intersect with our field. Let's get started. Hello, listeners, welcome to the Inside Tech Comm show with Zohra Mutabanna. Today I have Erica Jorgensen, who is actually coming out with a new book, so I'm really super excited. Please welcome her to the show. Erica, take it away, and give us a little background about yourself.

Erica Jorgensen:

Thank you, Zohra. Yes, I am Erica Jorgensen. I am coming to you from Seattle, Washington, and I am the author of Strategic Content Design: Tools and Research Techniques for Better UX, which is coming out April 11 from Rosenfeld Media. It was my little COVID project. It's not little anymore; it's 300 pages of guidance on how to do content research to better engage with your customers and to improve and add to your content. A whole lot of other topics are covered in the book as well. I'm so excited to get it out into the world, and I'm kind of in shock. I can't believe the day is finally coming around. It's going to be published.

Zohra Mubeena-Mutabanna:

Wishing you the best of luck. And I cannot wait to get my hands on the book, because I've had a preview, and I think it has such amazing knowledge for you to share and for me to take away.

Erica Jorgensen:

Yeah, I think it's pretty practical. It's intended for people to learn how to do content research: why they should do it, how to do it, how to humble-brag (or not so humbly brag) about the impact of it. I hope it gets into the hands of lots of people, whether they're in content design, UX research, product management, even engineering. I've seen lots of people across the different companies I've worked for, in different roles, get excited about content research, because it makes things better. And that's what we're here for, right? If we're working in product, or content design, or strategy, that's where the magic happens. And yeah, I had done enough presentations at work, enough workshops at work, that someone jokingly asked me, "Are you going to write a book about this?" And I did.

That's how it came to be. My coworker, Tom Reesing, had done a couple of, you know, sort of lunch-and-learns at Microsoft, and we had worked with usertesting.com a little bit; we had a partnership at Microsoft with usertesting.com. Then I did a workshop with the whole company: anyone who was using UserTesting was invited to a webinar I put on about content-focused research. What words are working? What words are clear? What words resonate? What words are confusing, and why? And I think the why is really the big part of it all. I did another internal workshop at Microsoft, and Tom was like, "You should write a book." And I was like, "No, I'm busy working in content design. I'm too busy." Then I thought about it, and I realized there might be something here. I had worked in publishing a long time ago. When I was in college, I interned at Little, Brown, which I think got bought by Time Warner; anyway, it was an independent, smaller publishing house that had since been merged with larger publishers. I had also worked at Amazon, back when it was a startup and they sold only books. So I was a little familiar with the way publishing works. I pitched a couple of different publishers on what I thought the book might look like, and Rosenfeld Media was the first one to respond and say, "Hey, let's do this."

Zohra Mubeena-Mutabanna:

That's awesome. And I think you kind of gave me some lead questions here: why is content research important? But before that, can you share with us what you mean by content research? Because in my head, I'm thinking user research. What is content research?

Erica Jorgensen:

Yeah, yep. Well, you could think of it as a flavor of user research. So often, when we think of UX research, it's a UX researcher taking prototypes, with the design and content together, and sharing them. That's one of the most basic or most common types of user research: you show a prototype to a current customer or a prospective customer and watch them clicking through, or not clicking through and stumbling, or getting through the jobs to be done. But you can decouple the design from the content and just test the words. Just evaluate them; just ask people, "Do you know what this word means? Explain this word or this phrase to me in your own words."

I have a really basic example I could share from an insurance company, where gold, silver, and bronze were the flavors of the products we sold. We sold health insurance plans that had gold, silver, and bronze versions, and our customers didn't know what Silver plans meant. They knew that bronze is the cheapest and gold is the most expensive, but no one was buying the Silver plans. I thought the HTML on the website was broken. I popped into the content management system to see what was going on, and the code was fine; the call-to-action button was functioning, but nobody was clicking on it. It turned out, after a quick online survey, that many customers thought silver meant Medicare, that it meant older people who have silver hair. You know, there's the "silver fox" expression; people thought that Silver plans were Medicare plans for people over 65 years of age, which wasn't the case. So a simple word like "silver" was confusing to our customers. That was eye-opening.

So if I were to define content research in a couple of sentences, it's a bias-busting evaluation, user research and critical thinking packaged together, of the content that's currently on your app or website, or the content you're planning to put there, whether for a new feature, a new product, whatever. It's an investigation and evaluation of what's working well, what's not, and why: connecting with your customers, connecting with your audience, understanding what's going on in their heads, and then taking the insights you've learned from the act of doing the research and improving your UX. And you can do it in many ways. You don't have to use a platform like UserTesting, UserZoom, Qualtrics, or dscout. Those make it very easy; they are not free, but they pay for themselves, and I think they're a great way to do this research if you don't have a full-fledged UX research team at your company, or if you need to advocate for putting time, money, and resources against this kind of project. You can use SurveyMonkey or other survey tools, or you can just coordinate with your customer service team, if you have Zendesk or other tools like that. There are so many different ways that customer service teams get information about customers, whether it's phone calls, chatbots, live chats, online surveys, Voice of the Customer. If you have a customer service team you can connect with to find out what they're hearing from customers, you can get a lot of ideas about what needs to be tested, what should be tested, what needs improving.

And again, I think product teams often don't coordinate with customer service or customer experience as much as they ought to, because those teams are on the front lines, hearing from the customers, hearing the complaints, hearing the compliments. That's a huge gap, I think.

Zohra Mubeena-Mutabanna:

Thank you for that good overview. But is content research something new? Because it's mind-blowing when you dig into the simplest of words, the words that you use all the time.

Erica Jorgensen:

Yeah, yeah, I think so. I mean, there are a hundred ways to do it. That's the thing; that's the beauty of it. And the confusing part, I think, is that when you have a prototype with design and content together, your users are a little biased: the design can help people along, or they know to click on a call-to-action button. But if you just work on the words, just evaluate the words, that's where a lot of insights can bubble up. If you're a content designer, you see those words that come up all the time: the way you describe your core product, your adjectives, the verbs you use, your call-to-action buttons, each word. When you dig in and hear from customers, when you ask them, "How clear is this to you? What word would you use instead, if any?", and you're careful with the wording of the questions so you don't introduce bias, it's super fulfilling and humbling to hear from customers about what's clear or not. Sometimes your jaw hits the floor and you go, "Oh my gosh, people don't know what we're talking about here," or "This is jargon to our audience." When you work for a company for a long time, you start to talk the corporate talk; American corporate jargon becomes embedded in the way you speak. I was talking with a friend this morning (we both had recent job searches), and she was talking about "moving the needle." Who talks like that? We do, when we work in tech. And some of that infiltrates into the user experience, for better or for worse, and more often for worse.

We all come from various backgrounds. I've worked mostly in tech, but also in e-commerce. I've worked for a health insurance company, I worked for Microsoft for five years, and I worked for Expedia. So I've worked for a lot of consumer brands, a lot of tech brands, all sorts of different products. Content research is like a gut check, for yourself and your team, to really understand deeply what's clear to your customers, and then go forth and improve the UX.

So yeah, I do think content research isn't new. But I think some of these tools, like UserTesting and UserZoom, empower content designers, PMs, and other people who aren't researchers by trade to do investigations on their own. And one thing I'd love to emphasize is that this is not meant to replace user researchers; they are some of my favorite coworkers of all time. So smart. I do not have a PhD in human-computer interaction, and a lot of my coworkers do; I do not intend content research to step on toes. It's best done in collaboration and partnership with your user research team. But it is something you can do on your own in many cases, if the researchers are spread thin. Maybe content designers can carve out a little bit of time to do this deep dive into the words and phrases that appear all over your website repeatedly, and make them work better. I think it shines a light on the value of research and the value of content, both. At Microsoft, I had partnered very closely with our user research team to kind of prop up the content research practice, and piggybacked on all the resources they already had in place. We historically had a wiki of hundreds and hundreds of research reports from years and years of research, and we added content research to it: we tagged the reports so we could search on content research as a topic and see, you know, do people understand what a "license" is? Is that the right word to use when we're talking about software? I did a lot of research on Office. Is "license" the right word to use? Do people know what an "app" is? They often don't. You'd be surprised: you get comfortable with words working for your company, and then your customers are just like, "What is this?"

Research teams are, curiously, among the most underfunded, understaffed teams in the product world. I would like to help change that, and I think that's one reason I wrote the book. It's a shame, to the detriment of the customer experience and to the detriment of business performance. If we hired more researchers, if we had one-to-one ratios with product managers, content designers, user researchers, and engineering, what a world this would be. I can dream, right? A person can dream.

Zohra Mubeena-Mutabanna:

That's excellent.

Erica Jorgensen:

It reminds me of high school or junior high, when you had a quick pop quiz and you got a word that you had to define. You got half credit if you gave a half-baked explanation: you know what the word is, but you can't explain it. If you can't explain it, you're not confident; you don't feel good about that word. And that erodes customer confidence. If there's a word on your website that people are familiar with but not super comfortable with, that can erode your brand trust, that can erode the customer experience, that can erode click-through and conversion rates. And those are things that content research can help you fix. You can provide tooltips or explanations where needed, like we did at Premera Blue Cross, where people didn't know what silver meant. We added a little line of content on the homepage saying, in effect, "This is an Affordable Care Act plan; if you want Medicare, click over here." The designers weren't thrilled about adding content to the homepage, but it was the right thing to do for the customer experience. And it was like flipping a light switch: as soon as we added that little explanation, which did muck with the design a little bit, we suddenly started selling Silver plans, which were the plans people were more likely to renew and be happy with. Those are huge commitments, if you're buying something you can't change for a year. It was good that we dug into "silver," but what if we hadn't? I can only imagine. We probably would have had to have layoffs on that team, because the sales quotas weren't being met. Senior leadership was aghast, buzzing around saying, "Fix it, fix it! What's going on here? Why aren't we selling the plans that we want to sell?" But we fixed that problem. So yeah, you've got to gut-check yourself.

Zohra Mubeena-Mutabanna:

You've got to check yourself, exactly. You've given me a lot of good information, and I want to tease it apart bit by bit. One of the questions that comes to my mind is this: when you do user research, you're usually doing it with a prototype, as you said. I'm not a researcher either, but as a writer, I'm invested in the idea that, as much as design is important, content is equally important. And generally, content is an afterthought when you think about user research, when you're thinking about usability testing, right? It's backwards, and what you said about researching your content addresses that. I have a bunch of questions here, but one I should probably ask is: when I'm doing content research, are there particular genres of content that lend themselves well to it? Or can any content that you're introducing be researched? Some pointers would be helpful.

Erica Jorgensen:

When I run workshops or do meetups, when I'm talking to people in content design or on other product teams about content research, the most common question I get is: how do we know what content to dive into? How do we know what to research? And I'd say a lot of that is tied to analytics. Think about what your most important content is; it's not always your homepage where people are coming in. E-commerce is a great example, and I focus a lot in the book on e-commerce, because so much money is tied to the words. If your words are working, you're going to sell more, and your company is going to be more successful. If you have a core product that you're focused on, if there's a KPI or OKR, a quarterly goal for your team to make something work better, or, dare I say, move the needle for a particular product or experience or job to be done, that's a perfect thing to dig into. You don't want to assume that content is working. And gosh, I've seen a lot of incomplete or absent content analytics and UX analytics. How do you know what's working? That is a huge challenge for a lot of product teams. Often you don't know. Whether you're using Google Analytics or a homemade dashboard, however you're analyzing your customer experience, sometimes the tagging is broken, or something didn't get tagged in the first place. There's such a gap there for analytics, so it's often hard to know what is performing well. Often it's a conversation with your PMs, with your broader leadership team: what is the most important thing we're trying to accomplish as a company or as a UX team? Dig into that content. And if you're launching something new, a brand-new feature, a brand-new product, of course you've got to dive into it before you launch, so you don't make a big mistake.

Look at the product Bard by Google: "Bard" is a homophone of "barred," and that's not great. I doubt there was content testing done on that product name. To answer your broader question about what types of research you can do: one is clarity research, what's clear or not. There's scalability research: what's going to scale? If you are translating your website or app into multiple languages, you don't always know what will work. What works for US English isn't necessarily going to work for Canadian English, and when it's translated into any other language, it might not work as well; it might be inappropriate, or offensive, or all sorts of things. You can use content research for translation purposes. There's preference research, and there's actionability research: are you likely to click on this, are you likely to tap on this, are you likely to engage with this? All of these types of research can give you both quantitative and qualitative findings. Now, it's not statistically significant: when you use a tool like UserTesting and test with 10 or 20 customers, you're not going to get something that is statistically significant. You can't go around and say with huge amounts of confidence, "85% of our customers are clear about this word." It's just, you know, sticking a flag into the breeze and seeing which way it's blowing. But that's still, in many cases, actionable. And when you ask, "Why, or why not? Would you engage with this? Why or why not? Is this clear?", that's where you get the nuggets of gold. My gosh: "I had no idea the customer was thinking that," or "I didn't realize this was an issue." The qualitative data takes more time to go through and analyze, but that is where, often, you'll be gobsmacked by what you learn when you do research. You can find out what's offensive, what's making your customers happy, what's making your customers less confident. The qualitative is really where a lot of the value comes from.
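Erica's caveat about small samples can be made concrete with a quick back-of-the-envelope calculation. The sketch below is a hypothetical illustration (not an example from the book or the episode): it computes a normal-approximation (Wald) 95% confidence interval for a proportion, showing how wide the uncertainty is when, say, 17 of 20 study participants say a term is clear.

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% confidence interval for a proportion (Wald interval)."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    # Clamp to [0, 1], since a proportion can't fall outside those bounds.
    return (max(0.0, p - margin), min(1.0, p + margin))

# 17 of 20 participants found the term clear: 85% observed, but the
# plausible range is roughly 69% to 100%. Directional, not definitive.
low, high = proportion_ci(17, 20)
print(f"observed 85%, 95% CI {low:.0%} to {high:.0%}")
```

In other words, a 20-person study can tell you which way the flag is blowing, as Erica puts it, but it cannot pin down a precise percentage.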

Zohra Mubeena-Mutabanna:

That's so amazing. Some of the things you mentioned about the type of content, or the way I've taken my notes, is: content that needs to scale, that is going to be translated, that could be offensive, or that might have bias built in. So I'm thinking the content I'm going to test is probably going to be something like UI labels, product names, calls to action; generally not long-form. Is that right? Or can even long-form content be researched?

Erica Jorgensen:

That's more difficult. Long-form content research is harder to do using tools like UserTesting and UserZoom. It's not impossible, but you've got to think about how you compensate the people who participate in the research; there's an ethical issue there. You don't want to take up hours and hours of your participants' time, because that's not ethical or right if you're reimbursing them with, say, a gift card, which is how a lot of research gets done. And you don't want to ask people to read 72 FAQs and tell you what they think; that's not going to give you a lot of actionable insights. If there's a certain topic that's bubbling up with your customer experience team, like, "Oh, we're hearing a lot about our chatbot," or some other specific element of your user experience, you can dig into that. But you wouldn't want to ask people to read pages and pages, because their eyes will glaze over, and you can't be confident they're reading it at all unless you sit there and listen to them read aloud or repeat back what they're reading. Not a super great use of time. Then again, there's less content like that on apps and websites these days, right? Brevity is so important. I'd say if you have to test long-form content, if your product team collectively agrees, "Yes, we need to figure out what's going on here from an analytics, from a performance point of view," that might be a great time to enlist the help of your user research team and do a moderated interview or ethnographic research. I think ethnography is underappreciated and not tapped nearly as much as it could be: actually being a fly on the wall and watching people use your content in real life. It's an underused technique for improving the user experience, because it's expensive and because it's time-consuming. But again, you get what you pay for. If you're going to dive into the qualitative research and find those nuggets, that's a good use of your time. I feel like ethnographic research is an underused tool in the UX toolkit.

Zohra Mubeena-Mutabanna:

Got it.

Erica Jorgensen:

Yeah, because it's time-consuming, and we're all expected to go fast: go, go, go. But sometimes it takes time to make a great customer experience, a great product. Ethnography doesn't always fit into a two-week sprint cycle, but perhaps it shouldn't.

Zohra Mubeena-Mutabanna:

With all these insights that you're giving me, I am curious to know, because I've done a little bit of user research in my career, and I want to learn more about it. I implemented this at one of my previous companies, where we had no budget. I think you touched upon that; there is rarely any budget given. With the user research and usability testing that we end up doing, we try to figure out creative ways. Can we apply those same techniques to do content research?

Erica Jorgensen:

Creative ways? I guess I'd be curious about an example that you're thinking of.

Zohra Mubeena-Mutabanna:

So let me try this. One of the things that we read is that when you're doing usability testing, if you get five users to do your testing, you've got a pretty decent idea.

Erica Jorgensen:

Yes, that's usually enough to give you an idea of what's going on.
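The "five users" rule of thumb Zohra mentions traces back to Jakob Nielsen and Tom Landauer's usability research, which modeled the share of problems found by n test participants as 1 - (1 - L)^n, with L around 31% per participant on average. A small sketch of that arithmetic (an editorial illustration; the formula is not discussed in the episode itself):

```python
def prob_found(n: int, detect_rate: float = 0.31) -> float:
    """Share of usability problems surfaced by n participants,
    per the Nielsen/Landauer model: 1 - (1 - L)^n."""
    return 1 - (1 - detect_rate) ** n

# Under this model, five participants surface roughly 84-85% of problems,
# which is why five is often "enough to give you an idea."
for n in (1, 3, 5, 10):
    print(f"{n:>2} participants -> {prob_found(n):.0%} of problems surfaced")
```

The curve flattens quickly: going from five to ten participants adds comparatively little, which is the usual argument for running several small studies instead of one large one.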

Zohra Mubeena-Mutabanna:

Would that same rule apply to content research? That was my question, because that's what we ended up doing.

Erica Jorgensen:

Yeah, I think so. And that leads me to an adjacent topic: A/B experimentation, or multivariate experimentation. I think that's an amazing tool that has really transformed the world of product development. You run an experiment, and you can see which version of your experience is performing better. Isn't that great? It is fantastic and useful, and executives love A/B experiments, because you get numbers; you can see, "Oh, this is working 25% better than the other version." But A/B experiments are also super resource-intensive and time-intensive. I can think of websites where we were testing different parts of the customer experience, and it could sometimes take months to get statistical significance. Who's got months when you're talking about a competitive product field? If you have competition breathing down your neck and taking chunks of customers away from your company, you don't want to wait months before you improve your user experience. So I think content testing can either obviate the need for A/B experimentation or inform it. If you're going to run an A/B experiment, you don't want to go out the door with just your control and a guess: your A version is your control, and your B version is your experiment. How do you know what to put in B? You can do a content research study to get a feel for which call to action might be more engaging, which words, which adjectives boost the customer's confidence. That way you're not throwing spaghetti at the wall; you're going out with a more informed version of an A/B experiment, which will improve the velocity of your A/B experimentation program. And I have used content research to avoid the need for A/B experiments altogether, because there are times when they're a real cost for engineering: they have to be babysat, and they can slow you down. You think they're a great thing, and they are great, but they're also not perfect. I feel like content research can do amazing things to help teams move away a little bit from A/B experiments and improve things faster. And who doesn't want to do things faster and better, right?

Zohra Mubeena-Mutabanna:

So correct me if I'm wrong. What you're saying is that A/B testing is a great technique available to us, but you're recommending something different. What is that, then?

Erica Jorgensen:

Either one. You can replace, or avoid the need for, A/B experiments by doing content testing. Think about what you ought to experiment on: it's often the content. It's not usually the color of the call-to-action button; it's what the call-to-action button says. You can do content research on variations of content and get an answer in hours or a day, whereas you might wait days, weeks, or months for an A/B experiment to tell you. I have a sort of joke in the book that we're not making a new COVID vaccine when we're doing content research; we're making websites. Statistical significance is great if you're doing something for a pharmaceutical company, but we're not doing that. We're working on websites and apps, and the customer experience is important, but do you really need statistical significance? When I was at Amazon, we would often have A/B experiments, or A/B/C/D/E experiments, going where you could see after just a few minutes which way things were pointing. It's a little dangerous to cut off an experiment early and say, "Now we can take action on what we learned." Maybe Amazon, and I shouldn't say this, worked too fast; we got so much information, and we made decisions so fast there. But a live experiment is risky: you're putting content in front of your customers. What if it's insensitive? What if it's wrong? Even if you're doing a 1%/99% split, that's still hundreds, perhaps thousands, of your customers seeing an experimental version of your user experience. You don't need to take that risk if you do content research. So that's one option: replace A/B experiments with content research.

The other is, if you're going to do A/B experiments, you can be smarter about them and inform what you're experimenting on by having your experimental version of the content shaped by a previous content study. Yeah, that is confusing; it's two things right there. And I don't want to be totally down on A/B experiments. But I hear people talk too much about, "Well, we have statistical significance." Great, that's great. But how much time did you spend? How many person-hours did it take to run that A/B experiment? Often it's so many more than you think: someone in engineering has to jump in and start the experiment, or they have to babysit it and check whether you've hit statistical significance or not. Depending on the sophistication of your content management system or your engineering back end, running experiments can be complicated. To me, content research is faster and less expensive, and it gives great results that are actionable. So why not switch the paradigm and do something that's not as sophisticated, per se, but gets you what you need?

Zohra Mubeena-Mutabanna:

What do you specifically mean by content testing?

Erica Jorgensen:

Let me give a concrete example to kind of bring it to life; I think it's hard to talk about in an abstract way when we don't have the user testing in front of us. There's one example I can share, not surprisingly from my time at Microsoft, where we had a new dashboard for IT administrators, and we wanted to know what to name it. There was a more complicated version of the dashboard and a simplified version of the dashboard. We got some participants and asked them, what would you name this? We started with a multiple-choice question; there were a couple of options we thought might resonate. Three to five options in a multiple-choice format is probably the maximum I would recommend. So let me stop there: we had a multiple-choice question to start with, which would you prefer for the name of this dashboard, A, B, C, or D? And then we followed it up with an open-ended question of why. So you start with a quantitative question, and I say quantitative even though it's not statistically significant, because you're still getting a percentage of participants who prefer A, B, C, D, or E. And then follow that up right away with a qualitative, open-ended question: now, tell me why you chose the answer you did in the previous question. Multiple-choice questions are great. Scale questions are great, too, like, on a scale of one to nine, which of these is clearest to you? Or, on a scale of one to nine, is this clear or not clear? You can use scale questions in these online survey tools to really get a lot of valuable information. For that particular dashboard test that I'm describing, it was interesting, because the PMs were very, very mindful that we had a simplified dashboard for more novice users, people newer to this particular area.
And then we had the experienced IT admins who, you know, had been working as admins for decades or so. We thought they wanted a sophisticated name for the sophisticated dashboard. But they didn't. So it was eye-opening. And this was sort of an audience-comparison research test as well: novice users versus more sophisticated users, are they really that different? They weren't. They all wanted something simpler, which kind of fascinated us. And I think when we asked them why, you know, everyone's job is hard, everyone's pressed for time; no one has a lot of time to spare when they're at work. People just want things simple, simple, simple. And that was eye-opening and bias-busting for that product team, because there was an assumption that the experienced IT admins were okay with complexity. They weren't. They preferred something simpler; they wanted quick things. So that was interesting.
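The study structure Erica describes, a multiple-choice preference question followed immediately by an open-ended "why", can be sketched as a small analysis script. This is a hypothetical illustration: the dashboard names and responses below are invented, not from the actual Microsoft study.

```python
from collections import Counter

# Hypothetical responses from a dashboard-naming study: each participant
# picks one name (multiple choice) and explains why (open-ended follow-up).
responses = [
    {"choice": "Admin Center", "why": "Short and familiar."},
    {"choice": "Admin Center", "why": "Says exactly what it is."},
    {"choice": "Insights Hub", "why": "Sounds modern."},
    {"choice": "Admin Center", "why": "Simple, no jargon."},
    {"choice": "Control Plane", "why": "Matches our internal docs."},
]

# Quantitative pass: what share of participants preferred each name?
# Not statistically significant, but it shows where preferences point.
counts = Counter(r["choice"] for r in responses)
total = len(responses)
for name, n in counts.most_common():
    print(f"{name}: {n}/{total} ({100 * n / total:.0f}%)")

# Qualitative pass: collect the "why" answers behind the winning option,
# so the team can read the reasoning behind the numbers.
winner = counts.most_common(1)[0][0]
reasons = [r["why"] for r in responses if r["choice"] == winner]
print(f"Why participants chose {winner!r}:", reasons)
```

The point of pairing the two passes is exactly what the transcript describes: the percentages tell you which option is leading, and the open-ended answers tell you why, which is where the bias-busting insight usually lives.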

Zohra Mubeena-Mutabanna:

That's an excellent example. I think that clarifies it for me. Content testing sounds like something so easy to do, in fact, right?

Erica Jorgensen:

Yeah, it's really basic at its heart. It's quite simple at the heart of it, but it can have wide-ranging implications for the product. That bias-busting, like, oh, the sophisticated IT admins, they like simple stuff too, sort of like, you know, celebrities: they're just like us. It changed our approach for future design work. If these sophisticated IT admins, who were knee-deep in complexity and in very complicated tasks that they had to do for their companies, liked simple stuff, that kind of changed the product roadmap. Not kind of, it did; it helped us think about the product roadmap for future quarters, for future sprints. If they don't like complicated stuff, why are we building all these complicated bits? That makes complete sense. The whole team was like, oh my gosh. It's not like pulling the rug out from under us, but it made us think deeply about, what do our customers really need? What are we assuming versus what do we know? Like all sorts of UX research, it kind of stops you in your tracks a little bit and makes you go, oh, I didn't know where I was making assumptions. And you know what happens when you assume. So it was really kind of cool to see that kind of thing happen, where the whole team goes, whoa, we're on the wrong track; we have to get on a different one.

Zohra Mubeena-Mutabanna:

Right. I think the term that comes to my mind is cognitive overload, which you want to avoid, right? And the fact that even experienced users wanted something simple. I mean, with our busy lives and being in the rat race, we do want our apps and our user experience to be simplified, so that we can actually do the task rather than worry about what is going on, and why, with the content itself.

Erica Jorgensen:

And that's a tough sell, I think, for some company cultures. That can be a really hard case to make, unless you have research to support that kind of approach: you know, this is what our customers want. A simple approach to things is often the best approach. I think that's so true, but that's often not how companies see it.

Zohra Mubeena-Mutabanna:

Very true. In fact, I was at this company a few years ago where, forget about user research, the engineers designed the UI. And then, as user advocates, as technical writers, we would come with suggestions, and we were told, how the heck would we know the users? And like you said, you know, their whole concept was: make things as busy as possible, because then it looks like a complex product, and that is easy to sell. It just never made business sense to me, right?

Erica Jorgensen:

You can charge more for a complicated-looking product.

Zohra Mubeena-Mutabanna:

Probably. I don't know.

Erica Jorgensen:

Yeah. I have seen more and more on TechCrunch and other websites about, you know, complicated products that aren't working well, or worse, software-as-a-service that isn't worth the expense, so companies are cutting back. I think you want things to work, most importantly, and for them to work, they have to be easy to use.

Zohra Mubeena-Mutabanna:

Yeah, as far as I know, I mean, the product worked, but you just didn't know what was happening. There were 20 calls to action on one screen, and they kept adding more. That's when we were trying to simplify, but there was no way to simplify it. It was a UI built in the '90s, and they literally continued with that legacy product, believing, oh, we are doing great, and this is how a user experience should be. Anyway, that's some trauma left from that, I suppose.

Erica Jorgensen:

I am sorry. That reminds me of my time at Premera Blue Cross. We simplified the language in all our internal communications and external communications; we simplified the terminology being used across the board. It took over a year to do this, but we lowered the overall complexity of the language, and that helped improve the revenue.

Zohra Mubeena-Mutabanna:

I mean, it makes sense. So that kind of segues into, you know, what sort of business impact content research can have. But before I dive into that question, and that is going to be my question eventually, I want to talk about ChatGPT, the new buzzword. I want to know what your thoughts are, not on ChatGPT per se, but on how it can influence content research. And can we leverage it? Because everybody wants to secure their jobs. Is there anything that we can do about it?

Erica Jorgensen:

You know, I think it's another tool in the content designer's, or the product team's, toolkit. There's so much going on. Just this morning, today is Saturday, the New York Times had a big article about ethical AI and the teams at Google and Microsoft who were either led to leave their jobs or were let go. You know, ethical AI teams have been decimated over the past couple of months, which is awful. But I think if you are checking your bias, if you are open-minded and understand that these tools are not perfect, you can use it like a thesaurus. Like, I always use visualthesaurus.com, an amazing tool, a very unattractive-looking tool, but an amazing tool that will give you all sorts of different words, adjectives, verbs, nouns, that are cousins of the word in question. I haven't used ChatGPT for my job yet, but I would use it as, like, okay, what are the words that we should be testing? What are some options for describing this? What words align with the voice and tone of my brand? So I think you have to be smart enough to use it wisely. Use it with a grain of salt, a healthy grain of salt. I think teams that try to use ChatGPT to replace content design teams, and I'm not saying this only because I'm a content designer, are making a mistake, because it can't replace a human. The large language models are trained by humans; they hallucinate; they're not as sophisticated as they should be. When I'm doing content work, I'm diving into the nuance of language, and I'm not going to rely on ChatGPT to do that in-depth word choice, tone, and brand-alignment work that goes into content. And I was just chatting with someone at Google who worked on their chatbot, and it's only as functional as a human has made it. So they will increase in sophistication.
I do worry about people using them in the wrong way, but I think of it as a tool in our content design toolkit, just like an online thesaurus, or even like bouncing ideas off a partner if you're doing paired writing: hey, how do you think this lands? Is this on our voice and tone or not? And if you look at the companies that are doing well in this wacky economy we're in, they have strong brands, and their brands are informed by consistent, unique content, a true sense of a unique voice and tone that connects with customers. And that's a tough thing to do, and that's why talented, experienced content designers are the best at making that customer experience come to life.

Zohra Mubeena-Mutabanna:

Wow. Thanks, thanks for lifting us up.

Erica Jorgensen:

Yeah. I think everyone's all gloom and doom, and it's like, no. You know, when cable TV came out, it didn't ruin network TV. Television didn't eradicate radio. It's another evolution. I do worry about deep-fake images and things like that, but from a content perspective, I don't think it's going to replace people. It is wacky times, though.

Zohra Mubeena-Mutabanna:

We are in wacky times. But I recently came across this term, techno-optimist, and that's what I would like to align with. And like you said, this technology is assistive and augmentative. It's not going to replace us. But we have to keep in mind that the data is fed by humans, and it will be consumed by us as well.

Erica Jorgensen:

So when we were at Amazon, we found that customer, user-generated content was not able to replace the content team's work. It didn't; it was just another, like, who wants to pick through, like a needle in a haystack, to find the helpful customer review among all the garbage, you know, the fluffy, unhelpful ones? We did have layoffs at Amazon when, as Jeff Bezos said, he wanted the content team to be replaced by the customer. That didn't quite work out. So, not to be overly optimistic, but I think, yeah, it's just another change, another twist in the topsy-turvy road that we who work in tech are now on.

Zohra Mubeena-Mutabanna:

I like that visual, let's go with that. It's just another twist.

Erica Jorgensen:

You have to be resilient, and I think I'm lucky enough to have that perspective of, like, yeah, this is another change.

Zohra Mubeena-Mutabanna:

Yeah, I agree. So coming back to the question: why is content research important, and how does it impact your business?

Erica Jorgensen:

Yeah, content research is important because it directly influences impact. I've seen it time and time again: a coworker will run a study and get insights that would not have been found otherwise, and it changes the UX, improves the UX, the language, to make it clearer, to make it more resonant with the audience. And then that leads to business impact. And I think my favorite example, which I talk about in the book, I think in the introduction, is my coworker Trudy, who was working on Microsoft's invoices for Office. Thousands, millions of customers get these invoices, and they were spiking constant customer-service calls because they weren't clear enough. And it took a while before this project landed in the content design team's lap, because of the back end; we couldn't update the invoices for various reasons until some platform work was done before we could actually tackle this project. But Trudy did a series of content research studies, using user testing, that showed what the customers needed to know to be confident about their invoices. And only after she did that work did she add more content to them, explaining: this is the formula we use to calculate the end result of what you owe every month, which was not there before. And that is netting $2 million in savings every year for Microsoft.

Zohra Mubeena-Mutabanna:

I think that tells me that this is not just for tech companies; this can be harnessed by any industry.

Erica Jorgensen:

Yeah. And I thought about that with the book's title, because, I mean, I'm a content designer, right? Now at Chewy.com, and I was on the content design team at Microsoft. But before that team, I was on a content marketing team, doing lead generation.

Zohra Mubeena-Mutabanna:

It is a no-brainer, from what you said.

Erica Jorgensen:

Yeah. So we used a lot of content research in content marketing, to understand which ebook titles would be effective, which social media headlines would grab the attention of our customers. It's not just for content design and product design, but all sorts of content. And I think that's where you have to know how to draw your boundaries. We did a lot of content research at Microsoft for the content design team, and then the marketing team was like, hey, I want to know what you're doing here, because I want to do it too. And I taught many people in marketing how to do content research so they could do their jobs. I'm like, you're in marketing, my sprint is jam-packed, I can't do your job for you. But I can point you in the right direction, see about getting you a license for the testing platform, and teach you how to do it. Same with, you know, social media advertising: any content that you put in front of your customer can benefit from content research. And people caught on fairly quickly. When you frame it that way, it's like, wow, why wouldn't you do this? It's silly; I don't understand why people wouldn't do it, except for the staffing thing. Like I said earlier, content design teams are understaffed. I run into people who say, I don't have time, I'm on a content team, I don't have time for this. There's always time if you look at your schedule and question the less impactful work. You can take that off your plate and do a little bit of content research every sprint, because it's worth your time. When my coworker Trudy had that $2 million result, the PM was like, we're saving $2 million, look at your impact. And she's like, oh, yeah, great, but I've got to go, I've got a kickoff. She was so busy, she couldn't even enjoy the fact that she was driving all this impact. But that's a glorious thing, and something not to humble-brag about, but to preach from the mountaintops: here is content driving impact, saving money, making money. Why wouldn't you want word of that to get out? I am just baffled why, especially in the C-suite of some companies, people just don't understand the role of content. I'm hoping my book can make a little dent in that and help people understand that content is really at the heart of the customer experience. So why wouldn't you want to go out the gate with a new feature, or a new product, or a new title or new name of anything, knowing it's going to work with your customers? Why wouldn't you want to leverage that power, for the sake of your customer experience and your company's bottom line? It's a no-brainer. And you can operationalize it: create templates for your test questions, and know which stakeholders to include and which to exclude, so they don't slow you down. How do you decide whom to include and whom to exclude? Well, you've got your core team. There's always a product manager, who's probably leading the charge for whatever the UX team is working on. In most cases, you definitely need to have at least your core PM involved. Let them know that you're doing content research, so they're not surprised. You don't want to blow things up and say, surprise, I've got this insight, and now we need to implement it, and it's going to disrupt the rest of the sprint for everyone on engineering. It depends on how your feature teams are set up; if you're in pods of content, research, design, engineering, and PM, which is typical, you don't want to surprise people.
And that's, I think, the tactfulness, the diplomacy, that you need to keep in mind during content research; it's really important. You don't want to insult the content creator, which might be yourself, and you don't want to, you know, undermine your authority as a content creator if the content you're testing is already live. I've had general managers say, wait, so we did a content test and we found reason to improve the current experience? Why wasn't the customer experience already awesome? It's like, well, we're not really staffed to put the time and effort into every project. You know, I think you don't have to backtrack and explain yourself, but keep a core group, like a tight circle of people, involved in the test, or study. And then be careful as you promulgate your results; as you announce what you learned, make sure that people are not left out. I think if you get into a cadence, like a regular drumbeat of research, where people are expecting results: we did this at Expedia, where we shared results every Friday, and people were waiting with bated breath, like, oh, we're going to get some research results, I can't wait to see. People actually excited for an email! That was really cool. Make sure that you don't have gaps in your communication; that's important too. But I think you can templatize your studies and write up your results in a quick and easy way. Like, we made a wiki for that at Microsoft, but we also had a really cool platform, sort of like blogging, where we had the title, the stakeholders, the core results, the key insights, and we tagged it with metadata. We made it as easy as possible by having, you know, artifacts. We operationalized it as much as we could to keep it simple. And we were just very careful and tactful about it.
If you have a result that is surprising about currently customer-facing content, be really careful about how you communicate it. Don't insult the content person; don't throw your product team under the bus. Approach it with a growth mindset: you know, we did our best at the time with the resources we had when we did the initial launch; now we have tools to make it even better. Frame it that way, so people in the C-suite or senior leadership aren't upset by the improvements. Like, no one should be upset by an improvement, but does that make sense? You'd be surprised; people would say things like, well, why don't we have world-class content already?

Zohra Mubeena-Mutabanna:

Okay. So I think, of everything that you said, my biggest takeaway is: keep it simple.

Erica Jorgensen:

As much as possible, as much as possible, right? Yeah. And create a roadmap of research; you can't test everything all at once. It's going to be hard for you to identify your most important content if you don't have your content house in order, or your product house in order. If you're not measuring content performance, if you're not measuring the customer experience, that makes it a little trickier. It's not impossible. But people come up with lists of, like, a hundred pieces of content, or a hundred content elements, they want to test, and I'm like, whoa, pace yourself. Until we have content research teams, that's going to be hard to accomplish; maybe that's in the future. But you've got to pace yourself and write up what you can do. Make sure you write up the reports so you can share them, so people can read and find your insights. I think the coolest thing about the content research program that we spun up at Microsoft was that we could see, oh, 400 people read my study. Oh my gosh, that's awesome. Having it shareable across the whole company was invaluable. People in marketing could see it; people in PR or social media could also benefit from our product research. It helped people all over, and then that helped us make the case for, like, oh, can we have a couple thousand dollars to support the renewal of these platforms that we're using to do the testing? Heck yeah, because it pays for itself many, many times over, with people across the company benefiting from it, of course.

Zohra Mubeena-Mutabanna:

Great insights, Erica. Is there anything I may have missed that you'd like to add?

Erica Jorgensen:

We talked about stakeholders. You know, one thing I'd love to mention, and give a hat tip to, is Sarah Richards, who wrote the first book about content design, and she is a strong advocate for simple language. Like we were talking about, keeping things simple: simple language is often denigrated as too simple or dumbed down. To be political very briefly, I think Donald Trump is responsible for some of that attitude, honestly, because he talks in language that is at about a second-grade level if you were to evaluate it with Flesch-Kincaid. Simple language is not dumbed down; it's powerful. When you're clear, and your customers understand what you're trying to say and communicate, that's beautiful. It's not easy to be simple. It's not easy to write short.
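For readers unfamiliar with the Flesch-Kincaid score Erica mentions, here is a back-of-the-envelope grade-level calculator. The formula itself is standard (0.39 × words-per-sentence + 11.8 × syllables-per-word − 15.59), but the syllable counter is a naive vowel-group heuristic of my own, so treat the output as approximate; the two sample sentences are invented for illustration.

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels,
    # and don't count a typical trailing silent 'e'.
    word = word.lower()
    if word.endswith("e") and not word.endswith(("le", "ee")):
        word = word[:-1]
    groups = re.findall(r"[aeiouy]+", word)
    return max(1, len(groups))

def fk_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

simple = "We changed your plan. You pay less now."
wordy = ("Subsequent modifications to your subscription necessitate "
         "recalculation of outstanding remuneration.")
print(fk_grade(simple), fk_grade(wordy))
```

Running this shows the gap Erica is describing: the plain version scores around an early-elementary grade level, while the jargon-heavy version scores well past college level, even though both say roughly the same thing.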

Zohra Mubeena-Mutabanna:

I think there's the other side to it, too.

Erica Jorgensen:

I guess there are two things I'm saying: simple language is really important, and it's not easy to achieve. And then I guess another thing I would love to mention is that often the results of content research show that sometimes your content needs to be longer in order to be clear. Brevity is great, but sometimes, like we saw at Premera Blue Cross, where we had to explain what "silver" meant, and not in a tooltip, because it was so important we had to put it on the homepage for everyone to see, content research shows that you need to explain things in order for them to be clear. So brevity is not always possible. It's a goal, a great goal, but clarity is more important than brevity. And I see that time and time again in the content research studies I've run, and in those run by my colleagues. Sometimes you've got to explain stuff; that's being respectful to your customer. And yeah, I keep coming back to impact, and showing impact to senior leadership in companies. Explaining that simple language is powerful does not always go over well, especially with people with MBAs, people who've got law degrees; they don't always want to hear that simple is best. But that's often the case. Again, it's really simple when you get down to it, but it doesn't always go over well when it's being communicated in a corporate environment. So beware; I have a lot in my book about being careful with stakeholder management and communication around content research. You've got to be real careful. But it's well worth carving out the time to do it, even if that means getting rid of office hours or other content design practices, or cutting out a couple of meetings a week. I bet you can do that. I bet every team can do that.

Zohra Mubeena-Mutabanna:

That's good food for thought.

Erica Jorgensen:

Oh, that's Rufus. It wouldn't be a podcast without Rufus chiming in a little bit, so say hi to Rufus. Rufus is my rescue dog, who is really good at guarding my house. Sorry.

Zohra Mubeena-Mutabanna:

I know, he had to chime in. We love you, Rufus.

Erica Jorgensen:

Yeah, he's a good dog. His needs are simple: he just needs to guard the house, but he does it too much sometimes. He's a good guy. I think the mail is coming or something. Sorry.

Zohra Mubeena-Mutabanna:

That's totally fine. Thank you so much, Erica, for your time. I wish you all the best, and I can't wait to get my hands on your book, because what I have is the digital copy. Yeah, I have the digital version that Lou shared with me.

Erica Jorgensen:

Yeah, and it'll be out on the 11th from Rosenfeld Media. My hat's off to them; they were super supportive while I was writing the book. And they've got a lot of other great authors on their roster, like Andy Welfle, Michael Metts, and Natalie Dunbar. I encourage people to check out their other titles, because they're all product-focused and design-focused. I turn to them time and time again, and I've got a library of them, signed, you know.

Zohra Mubeena-Mutabanna:

Thank you so much, Erica. I'm really, really happy that I got this opportunity to chat with you, and thank you for your time. Good luck.

Erica Jorgensen:

That's it. Thank you.

Zohra Mubeena-Mutabanna:

Take care. Bye-bye. Subscribe to the podcast on your favorite app, such as Google, Apple, or Spotify. For the latest on my show, follow me on LinkedIn or visit me at www.insidetechcomm.show. Catch you on another episode.