Experience Strategy Podcast: The Dark Side of Data Experience Design

To listen, click HERE

In this enlightening episode, we dive into the world of data ethics and user trust with our special guests, Andy and Michael Madson. With extensive expertise in data experience design, healthcare, finance, and user experience, they shed light on the critical aspects of clear data communication and empathic listening to counter the darker implications of data usage. Together, we explore strategies for democratizing data literacy and simplifying complex data concepts to ensure transparency and trustworthiness. As we navigate the complexities of data ethics in an era dominated by artificial intelligence, you’ll gain valuable insights into how to combat the dark side of data experience design and pave the way for a more trustworthy and responsible data-driven future.

Voiceover: [00:00:00] Welcome to the Experience Strategy Podcast, where we talk to customers and experts about how to create products and services that feel like time well spent. And now, here are your hosts, experience nerds, Dave Norton and Aransas Savas. 

Aransas: Welcome to the Experience Strategy Podcast. I'm your host, Aransas Savas.

Dave: And I'm Dave Norton

Aransas: And today, Dave and I are joined by not one, but two Madsons: brothers Andy and Michael Madson. I'm going to tell you right up front, because I had the question, so let's all get on the same page here: they are brothers who, I'm guessing, have shared some pretty, uh, significantly overlapping interests for their entire lives, but maybe we'll hear more on that in the course of this episode.

Um, but Andy is a data science professor at multiple universities. He's the founder of Analytic Access, which provides educational and [00:01:00] consulting services in analytics to healthcare and financial services. And Michael is an assistant professor of user experience and technical communication at Arizona State University.

And these two have a pretty unique seat from which to see what's happening in terms of data experience design and AI. And so we're really excited to have this round table today to talk about what's working, what really isn't, and maybe what we can all learn to help ourselves be better prepared for what's coming.

Before we get into this, Dave, I think we probably have some listeners who are unfamiliar with even the foundational concept of data experience design. So can you take a minute to just ground us in what we mean when we talk about data experience design? 

Dave: Absolutely. So when we think about data as a resource that [00:02:00] companies can deploy to support their customers, it kind of changes the way that we think about what it is that we're doing with data.

Traditionally, companies have thought about data as something that is extracted from the customer and used by the company to better target a particular, uh, I don't know, a particular customer need, or solve a particular company problem. What data experience design as a methodology tries to address is: if this is a resource that the company has, how do we use that resource to support the customer?

And we've done a lot of research around this. Why do customers share data? What do they hope to get out of the data that they share? What is most beneficial to [00:03:00] them? And there are kind of four things that are key starting points for data experience design. First, the company needs to know the job that the customer is trying to get done, uh, when they're sharing data, when they're using tools that have data in them, when they're trying to give feedback.

So there, there are certainly jobs that they're trying to get done. The company needs to know what those jobs are. Then, number two, the company needs to be able to collect the data in such a way that the customer wants to share even more. So there are techniques that you can use that will encourage the customer to share more in a positive and meaningful way that isn't manipulative.

And that's step two. That data that gets shared, the customer wants to be [00:04:00] used, in most cases, to support them at their point of use. So that's number three: what is their point of use? What is it that they're trying to accomplish? What's the use case that they're trying to accomplish? We collect all this data and then we use it for our purposes.

That's not good, more often than not, either for the company or for the customer. We need to use that data to support them and make their lives better. And when we do that, it takes us to the fourth step, which is that we are able to begin to anticipate their needs: new needs that they might have, new things that they're trying to solve for.

That leads us to better understand the jobs that they're trying to get done, to identify those additional jobs, and it creates kind of a virtuous cycle that we can go through.

And it's very different [00:05:00] from what I think a lot of companies out there are doing, where they're simply trying to collect data, or use the data that they have, to accomplish things that they hope they can accomplish. Uh, it's not tied to the job that the customer is trying to get done, not focused on the point of use that the customer is trying to address. So in a nutshell, that's what data experience design is. 
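
To make those four starting points a little more concrete, here is a minimal, hypothetical sketch in Python; the class and field names are ours for illustration only, not part of any actual data experience design tooling. The idea is simply that every piece of collected data carries the job to be done and the point of use it is meant to serve, so that anticipation (step four) works from jobs rather than from raw data exhaust.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataContribution:
    """One piece of data a customer chooses to share."""
    value: str
    job_to_be_done: str   # step 1: the job the customer is trying to get done
    point_of_use: str     # step 3: where the customer expects the data to help them

@dataclass
class CustomerDataProfile:
    customer_id: str
    contributions: List[DataContribution] = field(default_factory=list)

    def collect(self, value: str, job_to_be_done: str, point_of_use: str) -> None:
        # Step 2: collection is only accepted when it is tied to a stated job and
        # point of use, which is what gives the customer a reason to share more.
        self.contributions.append(DataContribution(value, job_to_be_done, point_of_use))

    def anticipate(self) -> List[str]:
        # Step 4: look across the jobs already served to suggest adjacent needs,
        # closing the "virtuous cycle" described above.
        jobs = {c.job_to_be_done for c in self.contributions}
        return [f"Check back in on: {job}" for job in sorted(jobs)]

profile = CustomerDataProfile("cust-001")
profile.collect("gluten-free", job_to_be_done="find a safe lunch spot",
                point_of_use="restaurant search results")
print(profile.anticipate())
```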

Aransas: I think of it as data to drive value for the customer.

Dave: Yep. I like that.

Aransas: Instead of value for the company, because we believe that value for the customer leads to value for the company.

Dave: Exactly.

Aransas: Rather than that being an afterthought. So, Andy, uh, let's, let's turn to you here. You work in data science. How do you think about data experience from [00:06:00] where you sit? 

Andy: I think about it in terms of a product.

A lot of times, data is historically viewed as a service. And the service is really ambiguous as to what the data professionals within an organization, or consulting services, actually do. So for me, moving from data as a service to data as a product brings a lot of empathy and focus on who the end user is and, as Dave noted, what jobs they're trying to solve.

There are particular industries, like healthcare, where this is a really complicated question. Um, and being able to identify what's the job, what's the supporting data, and what's the best and most advantageous way to handle that data and present it to the right end user, um, is really critical for experience design.

And, as Dave noted, it can drive benefits, um, [00:07:00] across all of the stakeholders. 

Aransas: I love that. It's a great summary right there. And from your seat, Michael, from a user experience standpoint, what's important to understand about data experience? 

Michael: Well, Andy emphasized moving from service to product. And I think in UX, our focus is more on moving from concept to solutions, especially when data get messy, and especially when user experiences are so varied.

So first of all, defining what data is, and then trying to get a grasp of what experiences we want to intentionally design. And I think that the four starting points that Dave mentioned earlier are a good way of whittling down how to define data and how to define experience within the context of a particular company.

But also within the context of driving value, as you said, Aransas, for particular customers. 

Aransas: That's, that's so true. And so well said. [00:08:00] And if we think now about what's working and what isn't, Dave made the comparison between meaningful and manipulative. So I'm interested to hear how each of you think about that delta.

What drives a sense of meaning for customers in terms of how we take and use their data? Or what drives a sense of manipulation, which is something that is increasingly front of mind for customers? We do quarterly research into perceptions on data, and we talk to tens of thousands of customers every year about this, and it is not a trend that is improving.

Unsurprisingly, we are becoming increasingly more reliant on technology, of course, and have been since [00:09:00] the advent of technology. That is not a trend that we will ever see change in our lifetimes. Um, but we are increasingly suspicious of it as well, while giving it greater and greater power over our lives.

Andy, let's go back to you. How are you thinking about this? 

Andy: I think about it constantly, because for that end user experience, you need to have that credibility, and the customer needs to believe that you're providing them accurate information, um, in a way that's going to meet their job to be done. I think most of the manipulation that I've seen is inadvertent.

Um, you know, using inaccurate or deceiving scales on charts, or presenting information in a very complicated way. We see that in healthcare, where maybe complex statistics are being provided without interpretation or context, and it really doesn't meet the job to be done, but it also fosters distrust.

People don't like to feel dumb [00:10:00] or taken advantage of, and when they're presented with information that they can't understand, through really no fault of their own, they immediately distrust the whole message. Um, so being able to really understand what the user needs to feel and think and understand in order to meet that job to be done is, is something I put a lot of time and effort into.
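
To make the point about deceiving scales concrete, here is a small illustrative sketch (the numbers are invented): the same values plotted with a truncated y-axis look like a dramatic swing, while a zero-based axis keeps the change in proportion.

```python
import matplotlib.pyplot as plt

# Hypothetical quarterly rates (%); the values are invented for illustration.
quarters = ["Q1", "Q2", "Q3", "Q4"]
rates = [14.2, 14.5, 14.1, 14.6]

fig, (misleading, honest) = plt.subplots(1, 2, figsize=(8, 3))

# Truncating the y-axis exaggerates a 0.5-point change into an apparent spike.
misleading.bar(quarters, rates)
misleading.set_ylim(14.0, 14.7)
misleading.set_title("Truncated axis (misleading)")

# A zero-based axis keeps the same change in honest proportion.
honest.bar(quarters, rates)
honest.set_ylim(0, 20)
honest.set_title("Zero-based axis (honest)")

fig.tight_layout()
plt.show()
```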

Aransas: And what are you doing specifically? What does that look like to better meet those needs? What's happening? 

Andy: Treating it as a product. I spend a lot of time with my end users, um, understanding what their job to be done is, and really, um, trying to prototype data and data solutions in front of them and getting their immediate feedback.

Um, I use a lot of tools such as Figma where I can really quickly create, um, "here's what your report or your KPI or whatever information we're looking for could look like; how do you feel about that? Does that [00:11:00] actually answer the job to be done? Um, how would you use this information?" before we then really move into the, the nuts and bolts of engineering that solution.

Um, so trying to spend a lot of time up front understanding my user and what they're trying to accomplish helps a lot with designing the solution effectively. 

Aransas: Yeah. And I, I suspect the same works for companies who are trying to meet the needs of their end users to really do that upfront work to understand how the data is used.

Um, because that's often the disconnect, right? We, we think, oh, well, I'm gonna provide the data, but if we're not providing it in a way that's contextually relevant, then it loses value for the customer. Michael, coming back to you, what do you believe is working in this space and what do you believe the biggest challenges are?

Michael: Well, I think Andy was much more generous than I am. He suggested that a lot of the [00:12:00] manipulations that he sees are accidental. A lot of the ones that I see in UX seem to be fully intentional. There's been a lot of scholarship on dark patterns, and they've become kind of business as usual, this way that we poke and we prod our users into giving up more data or doing things that they didn't intend to do.

And some examples are things like trick questions, sneaking stuff into their shopping baskets, or what's called a roach motel. Like, if you've ever tried to cancel Amazon Prime, it leads you into this massive maze, and at the end it says, well, thank you for reaching the end of this maze; please call our customer service line to try to cancel.

But one thing we're also seeing is that a lot of the data that's being extracted from customers is not just from keystrokes, it's from audio. And a lot of times audio data, or audio data collection, is not being disclosed. I'll give you a quick example. So my wife and I like to have very deep conversations, and the other day we were discussing [00:13:00] something like the relative benefits of crunchy peanut butter versus creamy peanut butter.

You know, something just very mundane like that. But then the next day she started getting peanut butter ads on her phone in her social media services. She had no memory of searching peanut butter on Google or searching peanut butter on Facebook or LinkedIn or any other service. But somehow the phone or the apps or something had captured what she said and translated it into targeted advertising.

And so one of the key questions that a lot of companies are going to have to address, as they continue to collect more data and more types of data, is how to avoid what you could call "privacy zuckering," named after Mark Zuckerberg (hopefully you don't get sued for me saying that), who is kind of notorious for collecting way more data than users intentionally agree to.

And a lot of apps, a lot of technologies of all kinds, do that as well. Sometimes they tend to be more transparent [00:14:00] about the keystroke side of data and less transparent about things like listening to your voice, or capturing the ambience of where you are, or your geolocation, for instance, which you might not be fully aware of.

And so on the user side of things, I think we need to have more data literacy, and on the company side of things, I think there needs to be a resistance to just doing business as usual, because a lot of these dark patterns are already entrenched. And if this is what businesses are already doing, without a clear ethical commitment to doing what's best for their end users, then things are not going to get better. 

Aransas: I keep thinking about that story about the robo vacuums that were sensing when there were moving boxes in a house, so that they would start targeting new advertising based on the moving boxes being present. I'm like, that just feels [00:15:00] so invasive.

And Dave has done a lot of research to understand kind of what triggers creepy. Uh, so Dave, I want to pass it over to you to share some of what you're seeing in terms of customer expectations or, or creep triggers. 

Dave: You know, I love this idea of, uh, what did you call it? Privacy zuckering, or something like that; I can't remember. Zuckering, yeah.

So, uh, we went through a period from like 2014 through 2018, and it's still going to continue, I'm sure. But the biggest issue that consumers had with data was: who are you going to share it with? And that's really, really creepy, what you're doing. And I think regulations are starting to catch up a little bit and, and companies are being called on the mat, uh, [00:16:00] for it, but obviously it's still happening.

What you guys are describing is occurring. Certainly in China, it's a different world; the government has full control over the data that's being shared there. So this idea of dark data sharing and dark patterns is real. On the flip side of it, if I, I am in a particular location and I need help finding, uh, some place to eat,

I really do want to share that data with people, because that is the job to be done and I need contextual data. I need, I need contextual inputs to help, right? I need to know the location. I need to know what the locals think about the food items. It would be nice if they know my preferences, [00:17:00] because that might help, uh, narrow it down.

And there's, there's a lot of elements that I need to know. And so when the customer actually has a particular job that they're trying to get done... let's take the different types of peanut butter. If the Madson family had decided to have a debate about this and said, okay, how many people in the world actually like creamy peanut butter versus how many people in the world actually like crunchy peanut butter?

That might, um, lead to something, uh, where they would want as much contextual data shared as possible. Um, but if it wouldn't, uh, that's a totally different thing. And I, and I think that's the big difference. If the company is not using the data to actually support the customer at their point of use, and they don't know the job that the customer is trying to get done, then they [00:18:00] can collect a lot of data, but they will do it in a way that is detrimental, um, and ultimately will be detrimental to the business model.

Mm hmm. 

Aransas: Mm hmm. Yeah, and I think we are seeing more and more companies being punished for lack of data privacy and for not appreciating these customer expectations around this. And certainly there have been some shifts in regulation around this. I think a lot of it, uh, at least in North America, is echoing what's happening in other, uh, other parts of the world.

Uh, Europe certainly has had a huge, huge influence on our regulations. So, um, Andy, passing it back to you, um, you know, you are working in two areas, finance and healthcare, which are perhaps the pieces of data that we as [00:19:00] human beings feel most vulnerable about and most protective of. And so how do you guide your clients to, uh, demonstrate a sense of respect and privacy?

You know, I do a bunch of, a lot of research with people in both of those fields, and it's the first question that comes up: can I trust you? So how do you get your clients to demonstrate that they are trustworthy? 

Andy: What I focus on a lot, with, um, any communication that goes out to an end user, a consumer, is the 7th grade reading rule. If a 7th grader can't read it one time and understand what you're saying, then it's too complicated. So that's my first stress test regarding any communication that goes out.

But the other aspect that I focus on, particularly with healthcare: the healthcare industry as a whole is really bad at using [00:20:00] data and really good at collecting data. But because it's, it's such an industry with a robust history, most of that data until 2008 wasn't digitized.

It was mostly, uh, you know, paper documents. When President Obama came in, um, and passed some regulations, the great digitization of healthcare records started to happen. But the systems that healthcare uses are really focused on data entry rather than data use. So we've got these medical records and claims records that have tons of information, um, which on paper seems straightforward.

However, using it on the back end, with the data systems and architecture behind it, is incredibly difficult. So one issue is that we've collected all this information that consumers are assuming we're using for their good, but we're not using it at all. Um, and that can breed [00:21:00] mistrust of, why did you collect this in the first place?

We may use it at a point in time, in a small-scale, qualitative way, you know, between doctor and patient or claims representative. But at the larger scale, the data is being stored and it's, it's hardly used at all. And, and that's something I'm focusing on right now.
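
The 7th grade reading rule Andy describes can be approximated with a standard readability formula such as the Flesch-Kincaid grade level. This is only a rough sketch: the syllable counter is a crude heuristic and the sample message is invented, so treat the score as a screening check rather than a precise measure.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of vowels. Fine for a screening check.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59

# Invented example of the kind of message a health plan might send.
message = ("Prior authorization for the prescribed durable medical equipment was not "
           "obtained, so reimbursement is contingent upon retrospective utilization review.")
grade = fk_grade(message)
print(f"Estimated grade level: {grade:.1f}")
if grade > 7:
    print("Fails the 7th grade rule; simplify before sending.")
```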

Aransas: And so, how do, how do you improve that? Because I, I, I think that's a deep systemic issue. So I'm curious how you actually tackle creating any change there. 

Andy: Thinking of it as a product. For me, one of the largest bottlenecks is the concepts. Having a consistent definition of various concepts throughout healthcare, um, is the first step.

How do you define a readmission rate? How do you define a type 2 diabetes patient? Um, though they seem straightforward, it can be a different definition within the same company [00:22:00] and across the industry. Um, so coming up with a common ontology is my first step, because once you can define these concepts and these definitions, then you can start to align, um, this complicated data to those concepts, and then you can report out on those concepts.

But until that exists, um, you just have a massive amount of disparate data within an organization and industry. 
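
One lightweight way to start on the common ontology Andy describes is to keep each concept's definition in a single shared piece of code, so that "readmission rate" means the same thing in every report. The sketch below is purely illustrative; the 30-day window and the rate calculation are simplified assumptions, not an industry-standard definition.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List

@dataclass
class Admission:
    patient_id: str
    admitted: date
    discharged: date

# The shared definition: a readmission is a new admission for the same patient
# within 30 days of a prior discharge. Every report imports this one function
# instead of re-deriving its own version of the concept.
READMISSION_WINDOW = timedelta(days=30)

def readmission_rate(admissions: List[Admission]) -> float:
    ordered = sorted(admissions, key=lambda a: (a.patient_id, a.admitted))
    readmits = 0
    for prev, curr in zip(ordered, ordered[1:]):
        if (curr.patient_id == prev.patient_id
                and timedelta(0) <= curr.admitted - prev.discharged <= READMISSION_WINDOW):
            readmits += 1
    return readmits / len(ordered) if ordered else 0.0

admissions = [
    Admission("p1", date(2023, 1, 2), date(2023, 1, 5)),
    Admission("p1", date(2023, 1, 20), date(2023, 1, 22)),  # back within 30 days
    Admission("p2", date(2023, 2, 1), date(2023, 2, 3)),
]
print(f"Readmission rate: {readmission_rate(admissions):.0%}")
```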

Aransas: And so it also enables sharing it throughout the organization in meaningful ways, as opposed to just sort of letting it sit there and fester and become ripe for hacking and, and theft.

So how about you, Michael, how do you think about fostering trust for customers? 

Michael: Sure. Well, I've also worked in healthcare, like Andrew has. In fact, before I came to Arizona State University, I worked for the Medical University of South Carolina, which is a large, comprehensive health [00:23:00] sciences center, and they did a lot of work in particular with the emergency department, where we were trying to tackle some of these big issues. And I think that eventually healthcare is going to have to move towards what you could call the democratization of data literacy, because design professionals are in such high demand, and there's so much data, they can't possibly do it all.

Or sometimes, if they work on a consultancy basis, they may, they may lack the industry-specific knowledge to really get down to the nuts and bolts, the fine details of the problem that an organization is facing. But I've met physicians and nurses and other healthcare providers who have learned just a little bit of data science or a little bit of data analytics to actually effect positive change.

I'll give you a couple of examples. One was of a physician who worked with intimate partner violence, which was being funded at the time by national organizations. She didn't have a [00:24:00] super strong background in statistics herself, and she didn't have a super strong background in computer science.

But she knew people who did, and by collaborating as part of a team, they were able to use natural language processing to go through thousands of medical records and review what physicians are writing about when someone comes in and says that they've experienced intimate partner violence, which is a tricky term, because people use lots of different vocabulary to describe it, and they might resist describing it at all until they really start to trust the healthcare professionals. And then, based on their natural language processing, they compared their findings to the actual diagnostic criteria from their accrediting bodies to see where we need updates, and last I heard, they were able to do so.

Another example is of researchers out in Florida who are very concerned about the experience of HIV and living well with HIV. [00:25:00] Some of these patient stories are just kind of missed in the massive net that the healthcare industry drags across all of their technologies. The personal experiences of health and illness are often left out unless they're specifically asked about.

And so through focus groups and interviews, they gathered a bunch of qualitative data, looked for trends, and then they developed personas that they're building into graphic novels to inform people who are recently diagnosed with HIV: what the risks are, where to go for help, and, in short, how to live well with HIV, using data that is often missed, again, by that big net that the healthcare system drags through people's personal lives but that often leaves out things that are very important.

And so I think that in healthcare specifically, but in other industries more generally, we can foster a sense of trustworthiness by actually improving people's [00:26:00] lives. It's not abstract wins, but very concrete and specific and meaningful wins for users of all kinds. 
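
The natural language processing work Michael describes can't be reproduced here, but a minimal sketch of the general idea, scanning free-text notes for the varied vocabulary used around a sensitive topic and tallying what appears, might look like the following. The phrase list and the notes are invented for illustration; a real study would build its vocabulary with clinicians and use far more sophisticated methods.

```python
import re
from collections import Counter
from typing import List

# Hypothetical, simplified phrase list; a real project would develop and expand
# this vocabulary with clinicians rather than hard-code it.
IPV_PHRASES = [
    "intimate partner violence", "domestic violence", "partner hit",
    "afraid of partner", "unsafe at home",
]

# Invented free-text notes standing in for real medical records.
notes = [
    "Pt reports feeling unsafe at home; partner hit her last month.",
    "Follow-up for hypertension. No acute concerns.",
    "Discussed domestic violence resources; pt afraid of partner.",
]

def flag_mentions(note: str) -> List[str]:
    # Return every phrase from the vocabulary that appears in this note.
    text = note.lower()
    return [phrase for phrase in IPV_PHRASES if re.search(re.escape(phrase), text)]

counts = Counter()
for note in notes:
    counts.update(flag_mentions(note))

for phrase, n in counts.most_common():
    print(f"{phrase!r} appears in {n} note(s)")
```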

Aransas: It's funny because we've talked so much about the negative side of listening.

And what we're really talking about is deep listening with an intent to really hear and understand the needs versus the numbers. So Dave, passing it back to you now. What does this mean in your mind for the future of experience strategy? 

Dave: Well, we as experience strategists really need to continue to think about data not as something that we just collect, but as something that we design, that we, um, create. It's not just the visual presentation of the data; it's designing how the data is collected, designing [00:27:00] how the data is governed, designing how the data is analyzed. All the way through, we need to be designing, uh, data in such a way that it actually supports the customers. And with the advent of artificial intelligence, there's even more that needs to go into the design.

One of the things that we haven't talked so much about in this particular podcast is that the customer expectation is: hey, you collected this data, you should know me. And if you know me, you should be able to anticipate things for me. And so we would expect that as, um, data analytics gets more and more sophisticated, as artificial intelligence plays more and more of a role in, in our lives, uh, things will be anticipated for us, that we will be prepared for.

That if we've built [00:28:00] that trust in the right way, we're going to be able to, uh, accomplish far more because of the intelligence that's associated with the solution. So I think that that's where the future is. I think that that's what experience strategists should have been doing, and they need to be doing more of it going forward.

Aransas: Yeah, and I think as we look at data in terms of customer trends over the last three years, but especially the last year, Dave, we're seeing such an, um, increase in the expectation of stability, because we feel so destabilized across so many life domains, from the impact on health, on finances, culturally, etc.

Over the last few years, I think people just hunger for some stability, balance, and a sense of psychological safety. We give so much data and [00:29:00] information to technology, and I think we do crave and hunger for it to give back to us by just making this ride of life a little bit easier and less terrifying, a little less like a roller coaster, maybe, um, or at least not quite a thrill ride.

Um, so I guess passing it back to Michael and Andy. Michael, we'll start with you. What do you hope experience strategists consider for the future in order to set themselves up to be valuable to their customers over the next couple of years?

Michael: Well, that's a big question. How much time do we have? Well, I'm a teacher, and so I tend to think of big questions like that in terms of what I can teach my students, or what my students are currently teaching me about experience strategy and where the future is going.

And there are certain technologies [00:30:00] or groups of technologies, such as artificial intelligence, that absolutely have to be reckoned with, in healthcare and finance and online shopping, where I think we still need to figure out how we can customize them so that they don't seem so evil. What sort of ethical guidelines do we need surrounding them, or what kinds of regulations do we need surrounding things like AI?

But I think that experience strategists also have to be very nimble. They have to be very reflective and very empathetic, so that they can upskill as the job market continues to shift and as the technologies continue to evolve. But more than anything else, I hope that experience strategists have a firm set of ethics and design values that they can go back to, because the contexts are going to be evolving, perhaps very quickly, in the future. I hope that they can decide that they stand for elegant experiences, or for socially just experiences, [00:31:00] or something core that they can keep going back to, to provide some guidance in the fog.

Aransas: Yes, I agree. And as we've seen Edelman's Trust Barometer drop again this year, uh, in terms of global customer trust, and with greater and greater transparency, I think we're all going to be held accountable for our decisions. And so often, when we have done research in human behavior to understand what, uh, what drives people to behave in socially or societally, uh, advantageous or, or expected ways,

it's usually the feeling of being watched. Um, so, you know, people will not put their dog's poop bags in the trash if they feel no one is watching. Um, but if they feel there's a chance somebody's paying attention, they will. And I think maybe the message you're sending to companies right now is: we're watching, [00:32:00] so, uh, be ready to, to put the bag in the trash.

That was a leap of a metaphor there, sorry. Andy, how about you? Same big question here. 

Andy: I hope, and I, I think the trend is, that data collection becomes part of the data product. As you noted, we've seen with regulations like the CPRA that over-collecting data that's not being used is actually becoming a liability for many companies, who now have to be accountable for it, um, and may have to provide it or delete it, um, or restrict it based on, um, requests from California residents.

So data creation and data usage really need to fall under one consistent strategy, which they currently don't. Um, generally you have software engineering teams who are creating [00:33:00] data collection products, and data engineers and analysts and data scientists on the back end are just trying to deal with whatever comes through the pipeline and, and trying to create some kind of value.

And there's not a consistent strategy across the two groups and the end users get whatever they're able to throw together. Um, so I think viewing the whole system as one product would move everybody in the right direction. 
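
Andy's point that over-collected, unused data becomes a liability suggests a simple audit: compare the fields a collection flow gathers against the fields any downstream data product actually reads, and flag the remainder for review or deletion. A hypothetical sketch, not tied to any particular regulation's exact requirements:

```python
# Fields gathered by a hypothetical signup and checkout flow.
collected_fields = {
    "email", "date_of_birth", "home_address", "device_id",
    "precise_geolocation", "contacts_list",
}

# Fields actually read by downstream data products (reports, models, features).
fields_in_use = {
    "email": ["billing", "account_recovery"],
    "date_of_birth": ["age_verification"],
}

# Anything collected but never used is a candidate for deletion: under
# data-minimization rules like the CPRA, data you hold but never use still
# has to be disclosed, produced, or deleted when residents ask.
unused = collected_fields - set(fields_in_use)
for field_name in sorted(unused):
    print(f"Flag for review or deletion: {field_name}")
```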

Aransas: Love that. Love that. For those of you listening, I hope you're taking from this some strong calls to action to collect data with the customer's needs in mind and, and truly at the forefront of these decisions because ultimately that's what all of this comes back to.

Dave, as we wrap up, any, any closing thoughts on this? 

Dave: You know, this is such an important issue, and it has been overlooked by companies, I think, for a long, long time. And as we move into [00:34:00] more and more companies embracing artificial intelligence, we have to get data experience design right. We just have to do it.

Uh, otherwise there are going to be some real problems. So, uh, there's an urgency, I would just say, uh, for companies to start paying attention to what data experience design actually is. 

Aransas: I agree, the time to wait is over. Uh, or the time to deprioritize it is over. Um, so again, for those of you listening, I hope you hear a strong call to action in this.

And if you want to learn more about data experience design, please head over to stonemantel.co, where you can find lots of our writings on this topic. We'll also share links in the show notes to the Madson brothers' work, so that you can continue learning with and from them. Thank you so much for listening.

Michael, Andy, thank you so much for spending this time [00:35:00] with us. Thank you. 

Michael: Thank you. 

Voiceover: Thank you for listening to the Experience Strategy Podcast. If you're having fun nerding out with us, please follow and share wherever you listen to your favorite podcasts. Find more episodes and continue the conversation with us at experiencestrategypodcast.com.
