Quality is critical to satisfying your customers and can be the make-or-break factor in loyalty and business growth. Loyal customers lower operating costs, allow your organization to charge higher prices, and improve long-term revenue and profitability. Delivering high-quality products requires a shift in mindset and a focus on the entire customer experience. Join this session to learn the skills needed to deliver a premium CX and help your quality engineering team build end-to-end test coverage that delivers delightful experiences.
Hello, everyone. Welcome back to our final Experience session of the day. It's so great to have you all here. I am excited to introduce Debbie Levitt, who is going to be talking to us today about using customers' definitions of quality and done. Before we get started, I'm just going to cover a few housekeeping items. If you have questions throughout the presentation, you'll see that when you minimize your video, you'll be able to find a Q&A panel. We'd love it if you could put all questions there so we can prioritize and get to them at the end of Debbie's presentation. For comments and any discussion, feel free to use the chat. I will also be posting a link to a quick poll we'd love you all to participate in; I'll post that in a moment. And you can find Debbie's slides on the right side of this session's page when the video is minimized. So, again, we'll leave time at the end for Q&A. And with that, I am so excited to welcome Debbie.
Okay, I should be unmuted. Thank you so much to Leah, Katie, and everybody at mabl for bringing me into your world today. If you can, there are a couple of little polls on menti.com; use the code 4931 4315. Go ahead and take that two-slide poll while I introduce myself, and we'll see the results later in my presentation. My presentation should run about 45 minutes, and then I'll use the rest of the time for your questions. If I don't get to your question, or you'd like to ask it more privately because it's something going on at your workplace, my email is on every slide. So please, please get in touch. It would be great to meet more people and chat with more people about these things.
Of course, speaking of meeting people and chatting with people, there are certainly ways that we can stay connected. If you end up liking at least some of what I say, you can find me on LinkedIn as Debbie Levitt. I've also got a YouTube channel called Delta CX, where I talk everybody's heads off about CX and UX. And we do have Slack and Discord communities. Let me introduce myself a little bit, for those who don't know me, which is going to be most of you. My name is Debbie Levitt. My company is called Delta CX. We are a full-service CX and UX agency and consultancy. We do projects, training, and consulting; we do a lot of business design and change-agent stuff to try to get companies working together better on their products and their evidence-based decisions. We need more of that. My background is as a strategist, researcher, and designer in CX and UX for a zillion years, me and the dinosaurs. Clients like to call me Mary Poppins because I fly in to fix everything, I can sing a few songs, and I fly away to where I'm needed next. Notice there's a little camera icon on the bottom of my slides. That's just to let you know that a slide is not going to animate further, for people who love to screenshot slides. But you are getting the deck, so you don't have to take screenshots. Also note that I'm saying CX and UX interchangeably, because I believe that when they're done well, they are the same thing. The customer experience and the user experience: I don't really differentiate between those. To me, everybody is a customer. There are potential customers, first-degree customers, second-degree customers. I'm not going to die on that hill, but I'm also not going to spend a lot of time on it today. Just know that if I'm saying CX, I mean CX and UX, the user experience as well. And hopefully everybody's been taking the poll; we'll see those results later.
So hey, are you living the Agile Manifesto principles? We claim that we're focused on customer satisfaction, supporting and empowering our teammates, allowing time in our process for good or great design, and being lean by cutting waste and reducing work we shouldn't do. But I've pretty much never seen these happening anywhere I work, or where my friends work. Is this happening on your team? Are people really using retros to identify problems and waste, and then truly fixing those problems and cutting that waste? How about the principle, "We welcome changing requirements, even late in development"? I wish, right? Can you imagine if we stopped the project when we realized it was garbage or not quite right for our customers? That would be agile and customer-centric.
Good design, or how about some great design, is important. Whether or not we invest time and money in qualified professionals for our product design, there will be a design; we should make sure it's a great match to customers' definitions of quality and value. Whether or not we invest time, money, or qualified professionals in QA, there will be QA: bugs exist, and if we don't find them, angry customers will find them. And I bet they're tired of dealing with broken stuff that your company decided was good enough. And whether or not we invest time, money, or qualified professionals in making our products fully accessible for every disability diagnosis and condition, there will be customers with accessibility needs trying your products. Design, QA, and accessibility can't be sprinkled on later or hoped for; they are components of the customer experience.
You choose whether they will be great, or a source of complaints, frustration, and attrition. Customers want and expect five-stars-out-of-five quality. They don't care if you have 10 developers or 1,000. They don't care if you have an agile or product mindset. What's the last website, app, or system that you used that you really loved? It really served your needs, it solved your problems, it was like they read your mind. Hard to think of one, right? But would your users put your products on their list of what they really love? Probably not. And when you use a system that frustrates you, you're probably not thinking, "Oh, how many sprints did they spend on that?" You're actually thinking, "Who built this junk?" And that's a good question. Who researched with users like you to learn your habits, motivations, needs, and tasks, and then architected and designed for those? Was this product tested with people like you before it was unleashed on the public? Or did we go with our best guess and say it's (everybody, say it with me) good enough? We fill our world with catchy sayings that we use as excuses when we're not agile, lean, or building what customers would find five stars out of five. Like "Just ship it," often without knowing if it's the right solution for our customers. "We'll fix it later." How often do we really fix it later? When is later, what will we have to delay, and what will it cost? Why are we releasing something we know now is broken and will have to be fixed later? "Users will have to figure it out." Or they'll flood you with support tickets, or they'll give up, maybe even cancel. "Fail fast." There is nothing cool about failing fast or slowly when real paying or trial customers have to struggle with that failure. "If you're not embarrassed, you shipped too late." Does that mean our goal is to be embarrassed by our product? I think most of us would prefer to feel proud that our product actually matches the customer's needs and tasks.
"Lean is the least we can do to get us to the next step." Lean disagrees. This is a bastardization of lean that seems to have come from the Lean Startup book. Anybody who has studied Toyota's original approach to lean, or Lean Six Sigma, knows that lean is about identifying and cutting waste. Fake lean is wasteful and takes us down the wrong paths, which means I would like to cut fake lean for not being lean. "Moving fast and breaking things"? Sure, but only if you fix those things before the public struggles with them. And our old friend, "It's good enough." For whom? The engineering team, stakeholders, or our paying or trial customers?
We must stop praying to the false gods of speed and giving our customers broken things. We must not use any of these as excuses for low effort or failures. All of this is high waste: not lean, not agile. Quality code for a bad feature that doesn't match customers' real needs and tasks? That's the wrong way to go. Putting band-aids on symptoms without finding root problems and fixing those problems? Wrong way to go. Assuming customers are happy with minimum viable ideas that they have to figure out, and keep figuring out when you release another experiment? Wrong way to go. Maybe we feel better when we say this stuff to each other, but we're only fooling ourselves. If we're not delivering five-star quality, everybody will notice. Read your app reviews and online ratings, and check surveys and NPS. Even better and more actionable: have CX and UX researchers observe and talk to customers. Ask customer support what's on customers' minds this month. How's your stock price? Is the media writing about you? Did we have customer-centric success criteria for our projects, and are we meeting or beating those standards, or making excuses? "Let's go faster" has dropped us into cycles of risk, waste, customer dissatisfaction, and attrition. Any of us can change this.
And it's a great opportunity for CX, UX, and QA to be partners. Hey, QA cousins, I'm from CX and UX. You and I are often the bearers of bad news: something isn't working, or something isn't going to be right for customers; maybe we should slow down and take another look at this so we don't have to fix it later. We're sometimes seen as the bottleneck. You too, right? Everybody hopes that we're just going to greenlight and rubber-stamp whatever they've given us. They don't want to hear that we found bugs or something really bad. But time and time again, we find those things, and we should be speaking up and saying something, because we're supposed to care about all aspects of product quality and customer outcomes. We're supposed to be empowered teams.
We're going to need to get better at finding risks and problems earlier, speaking up, and mitigating those risks. A great model for examining and calculating the real costs of our guesses, our mistakes, and what I like to call customer periphery (the opposite of customer centricity) is the cost of poor quality from Lean Six Sigma. Start with what the company spent internally on the project and all the time it took multiple teams. How about marketing and sales efforts? Did anybody write training on these features? What did it cost us to delay and fix things? You can start to imagine or calculate all of the time and money that was spent on this project. Then we also have the external costs. You might need to work with someone from customer support to see what complaints, tickets, or issues go with these features. Can we find numbers on customers who cancelled or downgraded? Do we have numbers on MVPs, and especially failed MVPs? What about environmental costs for physical objects? That's also important. What's the environmental cost when everybody needs a replacement part or a dongle? This is how we change the conversation.
Companies are sweeping the costs of poor quality under the rug. We have to document the parts of the project that could create any of these wastes or costs, add them to the risk register or risk documentation, and bring them up in retro, because agile is supposed to be about constantly reflecting on how things went and then actively making improvements. To nudge our company toward customer-centricity, we need to do a little digging on past projects. We all know of at least one project at our company that was a disaster: time wasted, money spent, morale awful, customers unhappy. What did that cost? Do we know? Can we estimate it? We might have to mention that project here and there to remind people why we need higher standards for what gets released to customers. And the ROI is on this screen: these are all of the costs that we will save by doing a better job the first time, the next time.
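To make that cost-of-poor-quality conversation concrete, here is a minimal sketch of how a team might tally the internal and external costs described above. The categories and dollar figures are hypothetical examples, not a Lean Six Sigma standard; plug in whatever your own teams can actually document or estimate.

```python
# Minimal cost-of-poor-quality tally. Category names and figures below
# are hypothetical examples for one failed feature.

def cost_of_poor_quality(internal: dict, external: dict) -> dict:
    """Sum internal (what the company spent) and external
    (customer-facing fallout) costs into one view."""
    internal_total = sum(internal.values())
    external_total = sum(external.values())
    return {
        "internal": internal_total,
        "external": external_total,
        "total": internal_total + external_total,
    }

internal = {
    "engineering_rework": 42_000,       # time spent fixing it later
    "marketing_and_sales_effort": 15_000,
    "training_doc_rewrites": 4_000,
}
external = {
    "support_ticket_handling": 9_500,   # complaints tied to the feature
    "cancellations_and_downgrades": 31_000,
}

print(cost_of_poor_quality(internal, external))
# {'internal': 61000, 'external': 40500, 'total': 101500}
```

Even a rough tally like this gives you a number to bring to retro; the total is also the ROI ceiling for doing a better job the first time.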
Okay, hopefully the poll results will pop up here. If not, I'll flip over to them and check them manually. Live poll? Well, we'll see if they pop in. So usually I say: raise your hand if you've ever known, while building something, that it wasn't right for target users. Everywhere I speak, every hand goes up. You know you're building something of poor quality that might have to be fixed or undone later, but nobody feels empowered to say anything.
Creating a higher-quality product means spending more time and money upfront, for the customer's advantage and for our advantage. But instead, somebody says, "We'll fix it later." And of course, we hide that fix in some other budget and call the original project a success. Isn't it amazing that we never have time to create a better product now, but we'll have time later to stop a future project to fix this old one? If our systems aren't right for every user interacting with them, can we congratulate ourselves for having an efficient engineering process? If agile is supposed to be about welcoming change even late in a process, why do we reward silence and avoid holding people accountable for disaster projects? Workers and teams who are afraid to stop a project that's likely to fail signal culture problems that will need improvement.
Okay, sadly, my poll didn't come up. Let's see what people said. You can't see it, but 34 people responded to the poll. 18% said never; I almost believe you. 21% said rarely; okay, remember, it's anonymous. 38%, the largest bar here, said sometimes. 21% said often, and 3% of you, roughly one person, said constantly. That's one I believe. Let's see if the next poll will come up here. There should be dots coming up. Come on, where are my dots? Oh, my grumpy polling system. I apologize. Let me see how people answered this poll. 33 people answered it. Most of the dots are on the far left side, with a bunch of dots in the yellow area.
Okay, so let's talk about this slide. Traditional flavours of agile are correct in saying that we must collect feedback from customers to know if our product is high value and considered high quality by users. But agile and the MVP typically imagine that we're going to collect that feedback after releasing the product. Whether that's an experiment or an MVP, it's usually something engineering is building, merging, and releasing, and that's when our risks and costs are absolutely the highest. And since agile usually excludes CX and UX, both the processes and the people who do the work, agile doesn't understand that we in CX and UX can get that feedback through early concept and prototype testing. Teams say they're agile while burning lots of money and time to learn very late in the game, sometimes weeks or months after release, that we've gone in the wrong direction. Learning this late that we have to fix or rework our product is, I think, waterfall in disguise. Agile can pride itself on doing smaller batches of work and releasing products more frequently. But if you're continuously releasing garbage and not finding flaws until weeks or months later, your boat just went over the waterfall. So I would say the very earliest moments of any project planning have to be focused on customers' needs, and CX teammates should be part of that project planning before the team approves features that have low or no customer value. Who's going to save us from risks and mistakes?
When teams don't have CX or UX practitioners, this often falls to you, QA engineers. You are the first people testing what we created from the perspectives of the end users. Sometimes our teammates haven't considered end users' perspectives at all; they were focused on business problems and business goals. Now QA is in a tough spot. Do I tell everybody we built the wrong solution? Can I go beyond just checking for technical bugs and tell my teammates I think we're building the wrong thing for users? Should I warn teammates that this can fail in small or large ways once it's in customers' hands?
Risk mitigation is one of the main purposes of CX and UX processes and practitioners. "Aren't those the artsy-fartsy people who keep dragging me into workshops and telling me to just have empathy?" Sadly, they might be, but I'm doing everything I can to burn that time-wasting workshop crap to the ground. When you think about people who work in CX or UX, you might be thinking these are artsy-fartsy hipsters, people who make screens pretty. But most of us are not artsy-fartsy hipsters, and there's a lot more going on in CX and UX than many people realize. The core and foundations of our work are cognitive psychology and human behaviour, not art. I have a degree in music and an MBA; I'm not an artist.
So let's talk a little bit about the main process that UX practitioners of all types use. It's user-centered design, sometimes called human-centred design, so you might see UCD or HCD. There are multiple phases here, including research, content strategy, information architecture, interaction design, testing, visual design, and of course monitoring it after it gets released. Each phase has multiple tasks that can be done, so CX and UX professionals must approach this process strategically.
We're not always going to be able to do all of these things as deeply as we would like; we do have to consider time and budget. But we would like to fight for more of that than we tend to get. So we have to think about what we want to do, how long it's going to take, and in what order we'll do things, because the order does matter. Research is going to influence and inform everything that comes after it. Information architecture informs interaction design and the things that come after it. Now, I've made this look linear, but it's fluid and often cyclical. For example, testing our interaction design concept often leads to iteration. This means another round of information architecture and interaction design; then we should test again before saying that this design is finalized. So in this sense, we kind of spiral our way down to the best execution and solution. But you'll hear about other models as well. You might hear "learn, build, test." Of course, that's an oversimplification of what it takes to learn, build, and test, but that's all baked into user-centered design: learning, building, testing, improving, and iterating. And you can also see how models that start with "build" are missing a lot of the early learning we should do if we care about building the right things for our target audiences. You might also hear "discover, define, design, validate, iterate." Double diamond is a derivative of user-centered design.
Design thinking is a terrible and controversial micro-derivative of user-centered design that has stripped away most of what's done in each phase and task. But ultimately, all of these models, good or bad, are processes that can be superimposed onto the phases and tasks of user-centered design. And many of you are already familiar with that little spot on my screen that says interaction design. That's typically our wireframes and prototypes, which indicate: what is this, how does it work, how do people move through it, how does it lay out? You might not be familiar with everything else in our process that makes that interaction design work well.
If we care about quality, then we need to care more about the quality of the process that gets us where we're trying to go and the outcomes we're trying to achieve. It's also why, when you ask your UX designer to just guess at some screens and make something pretty without research and testing, it's another high-risk guessing adventure that could lead us down the wrong path. I would say it is the lean sin of underutilized talent when we only budget enough time for UX to quickly guess at screens, when there is so much scientific and risk-mitigating work in what UX does. We claim we want to be evidence-based, we say we want to make informed and intelligent decisions, and we claim we want to work quickly and efficiently. But we don't stop and seriously consider how inefficient our team is when we work from guesses and assumptions. We can build what a stakeholder tells us to build, but if it isn't based on great customer intelligence, the risks are high. We could end up being far from agile and far from lean when we have to make changes, redo things, or, in really bad cases, roll back to an old version.
So how do we know what customers need and what their workflows look like? Your company probably uses a lot of surveys to learn what customers want. But have you ever seen the surveys they send out? They're often highly flawed, and they ask people to guess whether they would like an idea they haven't seen, or to predict something they may or may not buy in the future. I saw a guy post on LinkedIn today that his company sent out a survey asking, "Would you give us certain information in our form if we asked for it?" 80% of people said they would. When they added those fields to the form and made them optional, 0.4% put that information in. Rimshot. So things like this are not good evidence. The question, then, is how do we get better evidence? Research, in my opinion, is the most important user-centered design phase. I know a lot of people think it's design, because that's the tangible thing that they can see and get and try and talk about, but research is the most important, and we've got to use our current and potential users. It's not user-centered without users; it's not customer-centric without customers. We primarily want to observe users, because you learn a lot more about who people are and their tasks, contexts, perspectives, behaviors, needs, and goals by watching them. Interviews are excellent when we ask the right questions the right way, but sometimes what people say and what people do aren't the same. Statistics and quantitative data are nice, but there's no substitute for observing and interviewing users, deeply understanding them, and getting that qualitative data.
UX wants to know who, what, where, why, when, and how, not just how many. Every step of research is important. I find a lot of people don't even know what a CX or UX research process looks like: planning; writing interview and task guides; recruiting the right participants, and the right number of them; executing the research; interpreting it through analysis and synthesis; evaluating what we found; arriving at actionable insights; and making suggestions. You can see how different this is from, "We asked a few people what they like, and we'll build that." So I like to say testing is the QA of UX. We shouldn't skimp on or exclude QA testing for our code, so why do we rush out untested concepts? Usability testing helps us validate, invalidate, and improve our concepts.
Research and testing help us know what is quality and valuable to users way before a line of code is written. As an example, my company did a research study in 2020. It was a live remote observational study: we met people on Zoom, we had them share their screens, and we wanted to learn more about how online sellers list their items for sale on a well-known platform. We asked participants to share their screens and show us how they list one of their items for sale, and we found an incredible amount of information. We found that certain fields on the page were meaningless to people. They didn't know what they meant; they weren't sure if they had to fill them out, or what information went in them. We learned that they put band-aids and workarounds into their process: they had outside tools, systems, notes, cheat sheets, browser plugins, and more to try to improve their task completion. They found inconsistencies and inefficiencies. Why does the form ask if your downloadable art file PDF comes framed? They found error messages that didn't make sense to them; they couldn't understand what was going wrong or how to fix it. You're unlikely to discover these types of details about behaviors, workarounds, and tasks through a survey. A/B testing doesn't tell us this. NPS doesn't tell us this. And people tend to be less honest in focus groups, because others in the room are surely judging them.
By watching how individuals from our target audience do things now, we can deeply understand their tasks. Every single thing we learn is an opportunity to create a better product or service. That might be an incremental improvement, or it could be a complete reinvention that jumps way past the competition. These are risk-mitigating opportunities, and an opportunity for you to partner with your specialized CX and UX researchers. These are people who know how to do this right the first time; they're not guessing at it. And it's so important to understand customers' needs and tasks. We want to create the right products that meet or exceed people's needs while making sure these products are intuitive and easy.
No unhelpful error messages. I don't have a lot of time today to go into depth on how CX and UX work best in agile environments, so I'm gonna hit a few of the high notes. Just know I have over 600 hours of content on YouTube. If you're curious about my perspective on something, you can always go to YouTube and type Delta CX and whatever you're curious about, like design thinking or agile, and you'll get a whole bunch of videos of different lengths. If you find I speak clearly, you can watch them at 2x. So agile typically defines quality as things like improved software performance, few or no bugs, and meeting stakeholder requirements.
Agile typically has definitions of done that look like: we completed the work we planned to do, it fits business objectives, it's coded, documented, tested, and deployable. I got that out of order. These definitions of quality do seem to be self-reflective. Hey, did I do what I meant to do? And it's looking pretty good? I declare it quality and done. But this is where we need to introduce customer-centricity. Congratulations on writing working code, but does our product match users' and customers' definitions of quality and done? Does it fit their tasks and needs? Did the system optimize or complicate a workflow? If engineering's code is high quality and feels done, but the concept doesn't match what customers need, is it still quality? Is it still done? Or was all of that time spent engineering the wrong so-called solution a giant waste of time and budget? Make sure you're working with your CX and UX partners: have them write up their definitions of done, add those to engineering's definitions of done, and add CX standards and principles to acceptance testing.
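One way to picture a combined definition of done is as a single checklist that merges engineering's criteria with CX's. Here's a minimal sketch under that assumption; the specific criteria below are hypothetical examples, not a standard, and your team's list will differ.

```python
# A combined definition of done: engineering criteria plus CX criteria.
# Every criterion here is a hypothetical example.

ENGINEERING_DOD = [
    "code complete and reviewed",
    "unit and integration tests pass",
    "documented and deployable",
]

CX_DOD = [
    "concept validated in usability testing with target users",
    "task completion meets the agreed success criteria",
    "error messages reviewed for clarity",
    "accessibility checks pass",
]

def is_done(completed: set) -> bool:
    """Done means every engineering criterion AND every CX criterion
    is satisfied, not just the engineering ones."""
    return all(c in completed for c in ENGINEERING_DOD + CX_DOD)

# Finishing only the engineering items isn't 'done' under the
# combined definition:
print(is_done(set(ENGINEERING_DOD)))                # False
print(is_done(set(ENGINEERING_DOD) | set(CX_DOD)))  # True
```

The point of the sketch is the `and`: working code alone never flips the checklist to done; the customer-facing criteria have equal weight.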
Engineers sometimes complain that UX is siloed and they're bad partners. Okay, but are you reaching out to them during your processes? Probably not. So this can get better in both directions: CX and UX need to involve you more, you need to involve them more, and everybody can do better at including each other at the right times. This common template for agile sprint goals says: our focus is on [achievement/outcome]. We believe it delivers [benefit/impact] to customers. This will be confirmed when [event happens]. Supposedly this helps teams focus on users and outcomes, but that's not a guarantee, mostly because of one risky word here. Do you see it? Believe. Imagine work goals being based on something someone believes, a guess or assumption. Shouldn't we know it delivers certain benefits, because the concept has been validated and vetted, before we're planning engineering sprints around it? Every time we notice that we are guessing, assuming, hypothesizing, making things up, or working based on what someone believes, that's a flag.
Our CX researchers should study whatever it is, so we can know. Step in when you hear people wanting to work from guesses. So I've rewritten sprint goals to be more customer-centric: our focus is on [user/customer outcome]. CX or UX has confirmed that it delivers [benefit] to [customer]. This benefit will be measured by [KPI or metric]. In the old version, we're hoping to confirm later that there was a benefit to the customer. There might be zero benefit, and we're going to find that out late, after a lot of expensive engineering time. That's not agile or lean. CX can confirm the benefits and value earlier through user-centered design. Now we're not hoping to find out later if something delivered benefits to customers; now we're just going to measure success and performance. MVPs are rarely customer-centric. When someone mentions the MVP, this cartoon is typically brought out. It shows that you wouldn't build a car by starting with one wheel, then two wheels, then the car's body, and finally a full car. The cartoon shows that an MVP would be like building a skateboard first, then a kick scooter, then a bicycle, then a motorcycle, and then finally the car you wanted to build in the first place. In reality, neither of these is the MVP. If we're starting with sad user faces, we're getting this wrong.
In both of these examples, the early products you're releasing to users aren't close to what you're trying to build. If users need a car, neither a single wheel nor a skateboard will be a match for their needs. The MVP was originally intended mostly for startups who had no UX staff, sometimes no product managers, and usually no research on target audiences' needs, tasks, and workflows. Startups who had to try to impress investors wanted to show they could release software quickly. But is the MVP right for us? It's time to think more critically about it. Users don't want minimally viable anything. Make sure customers would give every release five stars out of five; let's not release something likely to create angry or sad faces. Another problem here is that since many companies are feature factories, your CX or UX team is typically told to build a skateboard, often without knowing if the car will even exist, or what users need that might look like a car. This means that we're designing for the now, in the box we've been put in. We're not able to think ahead, future-proof our project, or design for that ultimate goal or solution. So reframe the MVP: if you want to release a slice of something, make sure it delivers maximum user value. Instead of a skateboard, it's a small car. The MVP is supposed to be a tiny slice of our solution to check that we're going in the right direction.
But user-centered design handles that for us by creating realistic prototypes and then testing them with current and potential customers. We don't have to go through cycles of engineering to know if we're going in the right or wrong direction; we can gather better evidence before we spend time on coding and QA testing. Now, this is only going to scratch the surface, but I want to spend three or four minutes talking about feature-oriented versus task-oriented processes.
If you're curious, you can Google these things, or I've got hours of video on them on my channel. So let's scratch the surface a little bit, because I've got a few more slides, about seven or eight, and then I would love to take questions.
So many teams start with requirements, user stories, and feature ideas. These ideas are often a guess, assumption, or hope based on a desired business outcome. Maybe these are ideas based on how we can nudge user behaviors to meet business goals: hey, let's get people to click on that button more. Ideas around functionality become the features that we plan and roadmap. We feed those features into the product world, and we ask UX designers to create wireframes and prototypes for those ideas. So our solution is based on our pre-decided features. This approach is called being feature-oriented: you decide on the features first, and then everything is planned around that. If these features aren't a strong match to customers' needs, tasks, or realities, we may create a lot of delay, waste, and risk while thinking that we're fast. Hoping to find out later if our guesses were good is reactive and risky. It could be weeks or months until we get back survey results or other analytics.
Feature-oriented processes are easy to spot because they usually start with the build, such as "build, test, learn." They might be a variation: idea, quick design, build, test, learn, iterate, or evolve. Feature-oriented processes assume that it's not worth the time to try to learn more before just trying your first idea. We assume we'll guess well what people need, and we don't mind showing our cards to the public as we go. We'll find out later if there was any customer value. And we might be surprised.
So as an example, there's a situation where we never would allow "build, test, learn": how about buildings and bridges? We don't start construction on buildings or bridges based on an idea someone has; we have architects, structural engineers, and other specialists learning and designing before anything is built. Now, the opposite of this is called task-oriented design. It's been around for decades, and it's the opposite of being a feature factory. Task-oriented design embodies customer-centricity. What we research and learn about real users' tasks are the starting and focal points. This requires solution-agnostic, generative, qualitative research. I know, lots of words; Google them if you're not familiar with generative research. You should know, it's quality qualitative research, like, not a survey.
So remember, we're not checking if people like a concept when we're doing task-oriented design. We are learning who they are, how they do things now, and what their unmet needs are. As I like to say on my YouTube show: people, context, systems. Context is so important, and we often ignore it. So remember, we don't have a solution in mind while we're doing that early discovery or exploratory research. Then, from what we learned, we can write problem statements and prioritize those problem statements to figure out which user needs are our best product or service opportunities.
After prioritization, CX teams can work on concepts and on task-oriented, user-centered design based on that good evidence and the customer intelligence that came from the research. Once CX has done their architecture and design, testing, iteration, vetting, and validating of a solution, now we know the features; they derive from the solution and task-oriented design. Now you can take that solution and break it into features, stories, Jira tickets, and backlog items, however you like to do it. Feature factories start with ideas, possible solutions, or guesses and assumptions about target audiences. Feature factories' discovery research often aims to find out if anybody likes this idea we have. Task-oriented design requires us to be solution-agnostic.
Don't start with ideas or hypotheses. Work with CX researchers so that the team can have the knowledge and evidence that will drive smarter decisions. The required time and money are worthwhile investments that more than pay for themselves with customer satisfaction and loyalty. This slide did not turn out the way I expected. Tri-track Agile imagines three main streams of work going on simultaneously.
The first track is working on research: strategies, opportunities, and what we call the artifacts or documents from research. These feed a second track, which is research-informed. Being data-driven sounds good, but we must also include our qualitative CX research insights, not just numbers.
The second track works on roadmapping, architecting, designing the product, and testing and improving it. Notice that research informs roadmaps; that would be task-oriented design. As that work is ready, it's fed into the engineering world. This requires lots of planning. Engineering doesn't want to sit around with nothing to do waiting for CX, and CX must estimate their own time early in the planning. And in every phase, non-CX roles should not guess how much time CX workers will need for tasks that many people don't understand.
Everything related to Agile, Scrum, engineering, and delivery lives in that third stream, delivery. That way engineering can do their work with whatever approaches or styles they prefer, and it doesn't affect how product and CX are working. A diagram like this also highlights why it's important to have specialized CX researchers separate from specialized CX architects or designers. Some companies are trying to combine these into one job, but it would be hard, if not impossible, for one person to do two different streams of work at the same time and do them well.
Customer centricity requires us to care more about the quality of the work being done. We want world-class research and world-class architecture and design work done before world-class engineering and delivery. Not to mention that it's almost impossible to find someone who is amazing at all of those things, so be careful. Not everyone's telling you the truth.
Most Agile and Scrum infographics show the process starting with the product backlog: that's what engineering will build and in what order. But CX architects design what ends up in the backlog. Before design work, CX researches users, their context, support tickets, and anything we need to find opportunities, pain points, and priorities. Agile and Scrum infographics don't show user-centered design, and that makes sense, since user-centered design is two continuous streams of work that are separate from engineering and anything related to Agile. But we must consider the bigger picture and how we work better together. So our infographic looks like this. User-centered design takes place before vetted and finalized designs can be added to the backlog. Now you can complain that this is big design up front, but when else should design happen? In every industry on the planet, items are designed before they're built or produced. And while they're designed, their models are tested and improved. All of this is built into user-centered design. We will break out of our cycles of guessing, assuming, risk, and waste by properly utilizing this process and the specialists who do the work. And please stop saying "big design up front." It's some weird name somebody made up for my work because they wanted me to not do my work. That's not good for morale. So stop trying to put one jack-of-all-UX unicorn on your team and expecting them to do everything in the user-centered design process, especially in just days. Stop telling them they're not agile when they want more headcount and budget.
I'm suggesting that CX and UX teams work in groups of five: three researchers with two architects or designers, all five fully allocated to one project or Agile or Scrum team. Sure, that's more of a future goal. But let's stop stretching UX staff thin and then telling them they're not agile. My team of five highly qualified professionals meets all definitions of agile. If six to ten engineers working on something for weeks or months is agile, totally normal, nothing special, and not "big, puffy engineering" at the end, then five CX or UX practitioners working on something for weeks or months is agile and totally normal, and probably more agile than how you do it today. So make sure you're hiring qualified CX and UX researchers and designers until there are no bottlenecks. That only happens when leaders fight for budget and headcount. I wanted to end in a few minutes so that I can take some questions. I don't know how many questions there are, so I'll blow through a couple of slides here.
Hey, SAFe is absolutely the worst when it comes to UX. And many flavors of Agile and Scrum don't get UX, and that's okay; we're separate. I'm here to help glue it all together. But unfortunately, SAFe is the worst. They've got Lean UX out here by "metrics" and "vision," which shows they kind of have no idea who we are and what we do. And zero UX leaders, nearly zero I should say, just to be fair, because I can get a little carried away on this one, nearly zero UX leaders and experts like Lean UX. The Lean UX book, especially its 2016 second edition, presented a model that completely disempowers UX, focuses on business goals, and prioritizes guessing and assuming over research. It's hard to figure out which is worse: Design Thinking, whatever that means this week, or Lean UX, whatever that means this week. And nobody puts UX in the corner. But I say to everybody, take heart: the SAFe infographic has customer centricity in the middle, and a double diamond, which is a derivative of user-centered design. So let's use that as approval.
Let's set off a happy confetti to remind us: when I see this, I'm not going to go by how SAFe says to do UX, because they don't have this right, but I'm going to take it as approval of the user-centered design process and of prioritizing customer centricity. Whether or not you use SAFe, make sure you're working with CX and UX experts who have monster experience in how UX can fit into Agile processes. Flavors of Agile aren't handling this well or correctly, and their coaches aren't experts in UX. So what can we do? Instead of thinking of this as working with UX or adding some CX tasks to our process, think of it as customer-centricity. We're going to put users, and everybody who's going to touch this product, at the center of what we do. Not what we think they need, not what we hope they need, not what we think is good enough. We've got to put these people at the center of everything we do. We research them, we design for them, and we test prototypes on them. And when it's right, engineering can code it, hopefully once, without change requests.
So transforming toward customer centricity means trying new things: new ways of working, new processes and habits, new collaborations. And getting it right that first time means a better direction for our stuff now. Okay, it's not going to be perfect. I'm not suggesting we wait forever until it's perfect, but we could do a better job up front than we do now. It would save us so much time over the problems we have now: the root causes, the rework, the UX and technical debt.
Let's drop "faster, faster, faster" and work smarter. Proactively recognise and mitigate risk; plan better up front. There's nothing in agility or lean that says not to spend extra time up front to do a better job the first time. Agile welcomes feedback. They want feedback-driven requirements and changing as we learn. We need to welcome and recognise problems early, understand the root causes, and eliminate waste and defects in both our internal processes and the users' experiences. Architecting for customers' needs and tasks and being agile shouldn't be opposites; we should be on the same team. No matter what an Agile coach, Scrum master, or stakeholder says, the customer decides what is quality, what is done, and what is good enough.
They define that value, and if you don't use their definitions of quality, value, and done, you're going to have to deal later with complaints and failures. All the mindsets and ceremonies on the planet won't matter if we're not delivering products that are easy to learn, easy to use, and that match customers' needs.
Joe Rohde, recently retired senior vice president of Walt Disney Imagineering, said: "One should not be asking the question, how little can we get away with, but rather, how much can we afford to do to be impressive?" This is the quote to hang on the wall, and it's how we start reinventing the customer experience. That's the slides; I'll be happy to take questions until they throw me out of here.
Again, get in touch through the Delta CX universe if there's anything I can do for you. And my upcoming book, "Customers Know You Suck," should be available for pre-order this week. I'm still finishing up a few things, and to publish live I'm shooting for December 2. So I hope you will pre-order it, and I hope you'll give it a read and give it a chance. Over to Leah.
Thank you so much, Debbie. You got a lot of love in the chat. And I have a lot of great questions waiting for you here. So we’ve got about 10 minutes. I'll dive right in. How's that sound?
Yeah, let's do it. Let's take those questions.
What would be your advice in a situation where QA does actively speak up when they feel the wrong thing is being built, but they are rarely or maybe selectively heard?
Yeah, when I've seen that happen, it's typically a sunk-cost fallacy. It's typically, "Well, we built it, we might as well release it," or "We got this far." I think a good question would be: what can we do earlier to recognise those flaws if we don't have CX and UX? Usually that would be my job as a CX or UX professional: "Oh, we're going to build what, for who? Well, that sounds like a pile of garbage." So if there is no UX or CX voice of reason, or somebody called UX is really just an artist and doesn't necessarily have that strategic angle, I would tell QA to try to get more of that strategic angle themselves: "Hey, you know, I'm going to be testing this thing later; I would love to see it as we're working on it." You should anyway. I tell CX and UX people to show all of engineering our work as we go, not because we need your approval, but because that's part of a good collaboration; I say show engineering your work at least once a week. So I wonder if you can get yourself involved in that way, and be able to find some gentle way to speak up there.
A lot of UX designers don't always want to hear engineers' opinions on their work, which is not cool; we have to be a little bit more open to that. What I tell people is: if you are going to give a UX person or a designer a suggestion, make sure you're looking at it from the customer's perspective, and with data if possible. So make sure you're not saying, "Hey, UX person, that's a cool design, but I like this better," like so many engineering leads I've worked with; I have so many disaster stories. Make sure you are as evidence-based and data-driven as you can be: "Hey, based on what I've learned about customers through this, I'm wondering if that might not be the best way to go for people. What if we looked at this another way?"
So again, it's going to be a little bit of diplomacy and tap dancing. But I would say break that silo and get involved earlier. I would love to see you involved earlier. You know, good luck there.
Awesome. Thank you. I know we actually spend a lot of our time as QA folks talking about how to work with developers, but this question here is great. So, what advice do you have on how QA folks can work best with CX or UX designers? Where should we start?
Yeah, and I was just saying some of that. So let's say we're researching something; let's say this is the research phase of user-centered design. I like to collaborate with people at what I call three points. Number one is planning. So, engineering, that's devs and QA and everybody else there: hey, do you have any unanswered questions about the users? Is there something you wish we knew? Or do you have a guess or assumption about them that we should be clearing up with our research? So, collaboration during research planning. Again, a lot of people aren't doing this, but I'm doing what I can to inspire it. Next is collaboration when the sessions are being run. That means I do not want you running the sessions. I do not want you to ask the questions, no, please. But I do want you there, either live watching the session and taking some notes, or watching the videos later and taking some notes. That gives you first-hand experience with users in their environments, stumbling and struggling, whether this is early research to get to know users or evaluative research to watch them stumble through our crap. And then the third time that I want to see us working together during research is when it's time for me to report on it, and to be able to say: look, here are the insights that we found, and here are my suggestions for what we should do about it. You've got to get that information. We can't keep that from you. That's another silo that has to be broken down. So that is a time when QA should know everything I know.
Okay, you might get the short report version, but that knowledge has to be shared. Then during design, as I said, I love to see designers and architects, UX architects as we call them, meeting with engineers, including QA, at least once a week: here's what I'm working on, here's how it's going; hey, we tested it, here's what we found, we're going to make these changes. Hey, engineering, so many questions could be asked there. So that is my dream collaboration. I know that's rarely done. And then there's also collaboration in the other direction.
I love to see UX peeps getting involved, especially the designers, when there's acceptance testing and that type of stuff. Because while you're looking for some of your bugs, we can be testing for "doesn't match our design." Sometimes developers get creative. Does it match our design? Did we check for accessibility, and stuff like that? So there are a lot of collaboration opportunities that, in many cases, our companies don't do at all. And we've got to do something about that.
Love it. Thank you. All right, I've got another question here. It's a two-parter. I think we've talked to the first part quite a bit, about how QA and CX and UX can work cross-functionally. But the second part of the question is what to measure and when to measure it. There's not a lot of other detail there, but I think any context around metrics that QA folks can get more involved in, or get more visibility into, would be really helpful.
For sure. Chapter eight of the next book: measuring customer centricity. So yeah, there are a lot of things that we can measure, and I remind people to make sure that these are customer-centric measurements. What I mean by that is: stop measuring time on the page. If you do measure it, it needs to be lower. We want our people to work efficiently; we don't want more time on the page. That's not a good metric. We don't necessarily want more screens. I just got stuck with an account at Morgan Stanley for reasons I don't want to go into. I've closed it. But to log in, I had to go through six screens: what's your username and password, hey, we're going to text you a code now, we're going to text you a code here. I mean, this was such slow motion. So what does that tell me? The metrics are page views, and so the customer experience has been manipulated to meet the KPIs. We have to look hard and think critically; thanks, y'all, for our crappy, crappy KPIs. So we've got to measure things that are successful from the customer's perspective.
What can a customer do in our system that shows success? They completed a task; they got something done for why they're here. If we have a SaaS system, maybe it's entering data successfully without errors or error messages. Maybe they exported data or saw certain visualizations. If it's more B2C, maybe it's a very streamlined shopping experience where they spend very little time on our site and bought something. So we can take a second look at these metrics and say: cool, I know the executives love to hear time on page, views, clicks, people we tricked into joining the mailing list. I don't have any good words I can say in English about that. But we need to also ask ourselves, and work with CX and UX: what are things we can measure that show that our customers are having success in our system?
Another great metric I love is lower utilization of customer support: hey, we're not getting so many calls and tickets about that crappy feature; we made it better. So there's a mountain of stuff that can be measured. Get creative and step away from old, old stuff like page views and time on the page.
Thank you, Debbie. We probably only have time for maybe one or two more questions here, so just to switch topics slightly and get something a little deeper: how do you handle it when all the work that developers or other team members have done was awesome, but their changes don't actually work with what's currently in the product? Do you have any advice for handling those kinds of conversations, for work that's already happened and is shipping, and how we can improve in the future?
I wonder if you could read that question exactly, because it sounds a little weird to me.
How do you handle it when the work they did was awesome, but the changes don't work with what we currently have in the product?
This sounds to me like a case of "we fixed something but broke something else." Is that the impression you're getting? It could be. Yeah, sorry to the person who asked the question; if I don't answer it very well, send me an email or LinkedIn message.
But this is, again, having that more holistic look at the customer experience and customer-centricity: if we release cool feature X, but cool feature Y got broken along the way. Sometimes a question like this is about inconsistency in the product. Maybe you don't have a design system and different teams design things: one team released a thing that looks like this, one team released a thing that looks like that. And now, to get similar things done, it's a different process for the user. We do that when we are siloed and when we don't care about the customer experience. So again, if we can get our teams to transition more toward being customer-centric. And it sounds like we also need to have someone, I don't know what flavor of Agile you are doing or pretending to do, but we probably need someone at that director or programme level to look down at these things and see: where do these things intersect? Where do we have a similar function somewhere else that we need to make work similarly, either changing the old one to look like the new thing, because the new thing tested better and works better, or having the new thing work like the old thing, because the old thing still works well? So these are a couple of my guesses as to what you're shooting for. I hope I'm in the ballpark. But, you know, contact me privately if you think I can answer that one better.
Awesome. Debbie. I'll close out with this one. Aside from all of your great resources, your upcoming book, your YouTube channel, are there any other resources you would suggest for QA folks to learn more about UX concepts?
Yes. Number one: stay away from anything that says Design Thinking. Again, it's highly controversial, and sometimes people say Design Thinking and they mean awesome, complete UX processes, and sometimes they mean, let's all go into a room, get Stockholm syndrome, and play pin-the-sticky-note-on-the-wall. I don't do that. So be careful of some of your sources. Find somebody who can be a mentor or coach, like me: every Tuesday, live on YouTube, I do office hours, ask me anything.
I would love to get QA people asking me questions about UX or how we can work better together. I say I'm mentoring everyone at the same time, live on YouTube. Questions can be asked anonymously, so I'm not going to be like, "Ah, Jim from Intuit again." No one has to know who you are. There are some seminal books out there that people tend to recommend: they tend to recommend "The Design of Everyday Things" by Don Norman, and they tend to recommend "Don't Make Me Think," which is by Steve Krug. Anything that relates to human psychology and behavior is going to be your real, or pure, UX. Anything that looks like fluffy workshops is just going to be a little bit newer and more trendy. I'm trying to get people out of their workshop addictions. So those are some things. If you want to go wild, a super guy named Darren Hood keeps a monster list of books; you just have to Google "Darren Hood list books." He's got multiple lists: these are psychology books, these are UX books, these are information architecture books. I mean, the information architecture book is like 468 pages, and that's just one slice of one thing that some of us do.
Awesome. Well, I know we're a little bit over time. Thank you, everybody, for sticking with us, and thank you, Debbie, for joining us. That was a great, engaging presentation. So that concludes day one of Experience. Thank you all for joining us. We'll see you tomorrow: if you're attending virtually, we're kicking off right at 11am Eastern, and if you're attending here in Boston, we're excited to see you in person; registration opens at 11am tomorrow. So thank you, everybody. Have a great rest of your day. Thanks.