Chatting with Jeff Gothelf

Transcription

Hannah Bertiger  0:00  

Cool. Okay, so thinking about data, analytics, and tools: there are lots of solutions on the market, and a lot of companies already collect data on their products, but they can be overwhelmed by that data. So I was curious to hear your thoughts on which functions, and which design teams specifically, you've seen actually using data.


Jeff Gothelf  0:31  

Well, yeah, it's interesting. There are the folks we expect to have and use the data — the data scientists, the analysts, that type of thing. But the functions who should be using data? Let's start with product managers. At the most basic level, a product manager who is not using data is not doing their job, period, end of story. And it's interesting, because I keep coming across product managers — some have no interest in the data at all, and most struggle to get the data. To this day, that is still a challenge for the overwhelming majority of the organizations I work with. It's the rare organization that provides broad and clear access to data to everybody involved in the product development process.

So we start with product managers. Then, obviously, any designer with a certain level of experience should be infinitely curious about how their designs are working, how they're succeeding, and whether or not their ideas have actually had an impact on the user experience. Those, to me, are the functions: product manager, user experience designer, that type of thing. But you've got to qualify those in some sense, because the ones that do this well are curious — you could add that qualifier — and they're humble.

A lot of organizations — whole teams, and sometimes individuals — will avoid looking for data or using the analytics tools because the data may prove them wrong, and that's terrifying to them. From a personal perspective: suppose I got hired to be a product manager or a designer. I'm supposed to know what I'm doing, and if the data shows that I was wrong, am I going to get fired? The other reason they might be afraid is that the organization where they work doesn't provide the psychological safety to be wrong. It's not okay to be wrong there, so if I go check the data and it conflicts with what I said my design was going to do, or what this feature we shipped last week was going to do, I don't want to bring that up. I don't want to get in trouble; I don't want to make the project sponsor look bad. So the psychological safety to do that is a real issue.

My biggest client right now sits in a bit of an authoritarian, command-and-control type of environment. We're trying to teach them to be different, but teams will build the last thing their boss told them to launch, and they won't go check the stats or the analytics, because the measure of success is making the boss happy. So the teams that do this well not only have curious, humble product managers and designers — they've got the psychological safety to learn, because being wrong is learning, in theory, and then to talk about it explicitly in a way that allows them to iterate and improve the product. This opens up more and more layers, but that's the beginning of the answer to that question.


Steven Cohn  4:23  

So I want to dig into two things you said. Did you have any follow-up questions first?


Hannah Bertiger  4:28  

I was going to say, it sounds like the overall theme is that it doesn't matter which specific team or function it is. It's more at the organizational level, at the foundation: the culture needs to truly embrace that agile mindset, we could say, and truly commit to data democratization to be successful.


Jeff Gothelf  4:45  

Yeah, yeah, absolutely. And if you're looking for more specifics, help me understand where, and let me dig into things. Sorry — give me one second.


Steven Cohn  5:44  

Okay, you started that answer with something to the effect of — and I'm paraphrasing — if you're in product management and you don't use data, you're not doing your job. Do you feel as strongly about UX design? If you're in UX design and you're not looking at data about your designs, are you not doing your job?


Jeff Gothelf  6:08  

Yes — you're not doing your job. Or no, you're not doing your job, right.


Steven Cohn  6:14  

So there's a nuance there. What if you're in UX design and you don't directly look at your data, but you get the data sent to you by your product manager?


Jeff Gothelf  6:34  

Yeah, that's interesting. I've worked with a lot of organizations where, whether it's designers or engineers, I've heard the phrase "I trust my product manager." And I think there was a time when maybe that was okay, but I don't think we can live that way anymore. I think — and I'm fairly certain it's a Bezos quote — we have to trust but verify. The idea is not that you don't trust your product manager; in theory, they're not going to lie to you. But I think it's irresponsible to rely on a second or third party to convey what's happening with the product, because inevitably, as soon as that data passes through a human, it's editorialized. It's never going to be 100% of what it was: the moment it passes through a human, it gets translated into things they feel will make them look good, or more successful, or whatever it is.

So there's a democratization of access — a democratization of the data — that I think is ultimately required for teams to do good work. That's the key. And I think it's a designer's responsibility to ask for that access, and for the ability to do that firsthand analysis, that firsthand review of the data, because they may see things that somebody else doesn't. That's the whole point of cross-functional collaboration: different points of view, diversity of opinion, that type of thing.


Steven Cohn  8:31  

The second follow-up question I had: what do you do if you're a designer who agrees with everything you just said, but you find yourself in an environment that doesn't support that type of data-driven learning — one that's more hierarchical and top-down? In your core, you want to use data and do it exactly as you're saying; you have that high motivation. How do you break through? How do you show leadership and bring that into the organization?


Jeff Gothelf  9:17  

It's tough. If you're a staff-level UX designer, you're not in a position of leadership, and the best weapon you can bring — the best tool, don't say weapon — the best tool you can bring to this conversation is evidence. So if you're in a top-down, autocratic, "build what I say, design it the way I saw it on Netflix" kind of environment, the best thing you can do is get the data, and then present the data alongside the work, to justify any changes you'd like to make, or any decisions you've already made that you believe are improvements on what you were told to do.

That's not a foolproof solution, but I think it's the best tool you have. And there's a lot of risk there. The risk is, again, psychological safety: when you're being insubordinate, the risk is that you make your boss look stupid in front of their boss — and guess what, that's not going to go well. In the organization I work with now, people are terrified of their bosses; it's just not okay to challenge them. But I still think that if you're looking to drive some kind of change, the best case you can make for yourself is an evidence-based one. If you have access to the data, your chances of influencing, of improving the situation, of creating an objective conversation, increase dramatically. Without it, it's you saying "I like red" and your boss saying "Well, I like blue better." And you're always going to lose that argument, because you're outranked.


Steven Cohn  11:30  

Right. Let's take it one level up. Let's say you're not a staff designer — you're the head of design for a company. You know in your gut that you want to do this, but historically the company has not had that kind of culture. Maybe you just joined, but you lead the design team — they just hired you, so you've got a really strong position there. Same suggestion, or any tweaks?


Jeff Gothelf  12:09  

So look, I think as a new-hire leader you come in with a certain amount of new-hire capital that is yours to spend. It's not infinite, and it eventually runs out. So the question is: how do you choose to spend it, and how do you position that spend? That was very abstract, so let's get a bit more specific and carry the metaphor as far as it will go.

You come in, and if you're hired into a leadership position, the expectation is that you're going to change things up: you're going to offer some new and better ways, and you're going to lead the team towards better work. Okay, great. In those attempts to change the ways of working, the processes, the integration of design into the lexicon — or even just the conversation — of the organization, you will be more successful if you come to those conversations with data, with evidence, again, like before. Don't spend all of that new-hire capital in a way that basically says, "Well, I'm the design leader, you hired me, so we're going to do things differently." I've seen that too — I've worked in agencies with group creative directors who were basically, "I'm the group creative director, so we're doing it this way." And everybody asks why. "Because I like it." Frankly, we don't have the luxury of that. You have your experience and your expertise to point the team in a particular direction — great, but back that up with evidence. The more evidence, the more data you can put into those conversations — that's what we're looking for here — the longer the runway you have for that new-hire capital; you can keep spending it. Without that justification, I think the capital runs out much more quickly.


Hannah Bertiger  14:26  

And I think that leads into one of the other questions we wanted to cover: what barriers do teams face in getting started — designers specifically — on collecting this data themselves, rather than relying on the PM, and on making data-driven decisions or using the data as evidence in these conversations? And what are the continuing roadblocks they might face as they collect data with their partners to prove out their designs?


Jeff Gothelf  14:58  

Yeah, so I'll try to lay these out in order of severity — I don't know if I can do it off the cuff, but let's see.

The most severe barrier is that the data doesn't exist. We just don't have it. To me, that's the most extreme barrier to getting started on collecting data: we didn't instrument the code, we don't have an analytics tool plugged in, we simply don't have it. That still happens today, which is absolutely mind-boggling to me. So that's number one.

Number two — and it breaks down into a variety of shades — is access. We have the data, but it's buried in some SQL database somewhere, and the only way to get it is to have somebody with access to the database write SQL queries for you. So you've got to find that person, and you've got to ask them, and basically you're begging for favors, because they've got other work to do. I've worked in situations where we had a dedicated business-insights team — a data team — but that data team, like everybody else, has finite hours, so they allocated X hours to various people and departments within the company. The product team I worked on, for example — I can't remember the exact number, but we had something like four hours of BI time a month, and once our budget was used up, we were done. We couldn't get any more data that month: four hours of their time doing the research and giving us the insights we needed. And the CEO, of course, got something like 50% of the team's time. If we ran out, we had to go borrow another team's hours. So that's another obstacle.

Sometimes the data comes back and it's completely unusable — we pulled all these statistics, but we can't put them together into any kind of meaningful flow that tells us what's happening. Sometimes the tools we use are too basic, so you hit a point where you can't get the answers to your questions out of the data you've already collected. Sometimes the tool is too complicated: I used to work at a company called WebTrends a long, long time ago — an analytics company — and our tool was ridiculously complicated; getting any kind of useful, actionable data out of that system was a brutal process of navigating a bunch of screens and stringing together a bunch of queries. Simply not being allowed access to the data is a real issue as well: the data exists, but we can't get at it, for various reasons — I've seen that happen too. And I've seen teams granted access only to canned reports, where the canned report doesn't provide any real value. I'm trying to think of other barriers — all of that is just getting at the data.


Then, as far as making data-driven decisions, I think a lot of teams struggle to prioritize which data they should pay attention to. The team I'm working with right now just launched a product. They're based in India, so when I say some of these numbers they're going to sound remarkable, but remember, there are 1.4 billion people in the country. They launched about a month ago and they have 8 million users. Not bad for a month, right? 8 million users, and they're starting to get a bunch of inbound feedback — App Store reviews, Play Store reviews, emails, complaints, social media — plus all this usage data. And now they're saying: with 8 million data points, what do we do? Where do we even start?

So once you do get access to the data, one of the challenges is focusing the work: which numbers do I actually care about? Which ones should I pay attention to right now? That's a conversation most teams don't have, because they just say, "Hey, give me everything" — and it's overwhelming. This team is now dealing with significant prioritization issues. They're asking, "What do we work on next? I have 350 people complaining about this one feature that doesn't exist in the product yet — should I work on that?" Well, we've got 8 million users. Do 350 people matter, out of 8 million? It's worth discussing. So those are the kinds of barriers I've seen.
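
To put a number on that 350-out-of-8-million example: a complaint count that sounds large in the inbox can be a rounding error against the user base. Here is a minimal sketch of that sanity check — the figures, names, and structure are illustrative only, not anything Jeff's client actually uses:

```typescript
// Weigh a raw complaint count against the active user base before
// treating it as a priority signal. All numbers here are illustrative.
interface FeedbackTheme {
  name: string;
  complaints: number;
}

function complaintShare(theme: FeedbackTheme, activeUsers: number): number {
  return theme.complaints / activeUsers;
}

const activeUsers = 8_000_000;
const missingFeature: FeedbackTheme = { name: "requested feature X", complaints: 350 };

const share = complaintShare(missingFeature, activeUsers);
// 350 / 8,000,000 ≈ 0.0044% of users — loud feedback, tiny footprint.
console.log(`${missingFeature.name}: ${(share * 100).toFixed(4)}% of active users`);
```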


Hannah Bertiger  20:37  

Sorry, go ahead — oh, I was going to throw out a question, but go ahead, Steven.


Steven Cohn

No, I mean, I think this is amazing. I'm sitting here thinking this one call is going to produce like 15 blog posts — they don't have to be seven-page posts, they can be two or three paragraphs each: one of the top five barriers to getting at the data, that type of thing.

But toward the end there you teased up something that I know is on Hannah's list, and it's important, so I want to make sure we get to it: measuring user experience — picking the right data. I had a conversation with a design leader at a company one time, and she said, "You know, this is an awesome tool. It's dramatically easier for me to use than Google Analytics, which we've tried and can't use. I can totally see how my design team would love it and find it so easy to use, with your Chrome extension and all of that — not having to do any tagging is great. But I have to be honest with you: I'm not even sure where to get started. How do I get started measuring user experience? What are the important metrics for me? How do I know what a good user experience is, or what a successful user experience is?"


Jeff Gothelf

If you're hearing that, then to me it's a symptom of a much larger problem with that organization, frankly, which is that those teams don't have a clear sense of the strategic purpose of the work they're doing. Someone told them to build a thing, and they didn't ask why, and they didn't ask for whom — they just said, okay, we'll build the thing. Because you quantify a good user experience with outcomes: measurable, positive changes in the behavior of your users and your customers. That's how you quantify user experience. Did we positively impact the behavior of the people who are using the system? Are they more successful — whatever success means for the service or the product — with the tool that you built for them?

So if someone says "I don't even know where to start," they're not thinking strategically; they're thinking hyper-tactically. Somebody told me to build this form. What does the form do? Well, it collects data for a mortgage application. Okay, great, I built the form — how do I measure the user experience of it? Well, who's the customer? What problem are we solving for them? And if we solve it well, what do we expect them to be doing differently? We expect form completion rates to go up. We expect inbound inquiries to the call center to go down. We expect maybe an increased number of mortgages to actually be approved, because we're going to stop people from applying if they can't meet certain thresholds along the way, and so on. You design the user experience to do whatever it is you want it to do. But when I hear something like that, the red flag for me is that this team doesn't know the strategy or where their product fits in it. They don't know their customer, and they're focusing strictly on output — on building stuff — and not on why they're doing what they're doing.
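
To make that mortgage-form example concrete, here is a minimal sketch of turning those outcome statements into numbers you can track against a baseline. The field names and figures are hypothetical, not from any particular analytics product:

```typescript
// Outcome metrics for the hypothetical mortgage-form redesign.
// In practice these counts would come from your analytics tool.
interface PeriodCounts {
  formStarts: number;          // users who began the application
  formCompletions: number;     // users who submitted it
  callCenterInquiries: number; // inbound calls about the process
  approvedMortgages: number;   // applications that were approved
}

function completionRate(p: PeriodCounts): number {
  return p.formCompletions / p.formStarts;
}

const baseline: PeriodCounts =
  { formStarts: 10_000, formCompletions: 4_200, callCenterInquiries: 1_500, approvedMortgages: 900 };
const afterRedesign: PeriodCounts =
  { formStarts: 10_000, formCompletions: 5_100, callCenterInquiries: 1_100, approvedMortgages: 1_150 };

// "Did we positively impact behavior?" becomes: did these numbers move?
console.log("completion rate:", completionRate(baseline), "->", completionRate(afterRedesign));
console.log("call-center inquiries:", baseline.callCenterInquiries, "->", afterRedesign.callCenterInquiries);
console.log("approvals:", baseline.approvedMortgages, "->", afterRedesign.approvedMortgages);
```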


Steven Cohn

Now, I've interviewed over 200 companies for this company — 200 potential customers — and I find, almost across the board, that design teams never measure the outcomes of their outputs. They never tie the impact of their design work to metrics. Very few get any type of analytics — it's the rare exception that logs into an analytics tool. Most of them will get data from their product manager, but at a much higher level of abstraction than the specific design I delivered, or the specific epic we're working on.


And so I think they struggle with which questions to ask. You just highlighted some questions about how to determine what success looks like. I think it might be as simple as: here are four or five questions to ask to determine what you should measure to determine good user experience. If you can ask those questions, then you can figure out the right metrics.


Jeff Gothelf  26:11  

But think about it — they should be asking those questions at the beginning of the initiative. We're preaching to the choir here; I agree they're not. But these are basic UX questions, basic do-your-job questions. Who is the user? What problem are we solving for them? And then, what is success — if we're successful, what do we want to see them doing differently? Those, to me, are fundamental questions for kicking off any kind of initiative. I understand that they don't always get asked, but to me, that's a good UX designer.


Steven Cohn  26:56  

Some of them aren't even asking those questions. But I think most are asking the first two; I don't think they're asking the third. The third is where they're constantly stopping, and they're just outsourcing it to the assumption that product has done their job: if product tells me this is important and it'll move metrics, I'll just trust that it's important and will move metrics. I know who the user is, I know what problem we're solving — I get that. But "what does success look like?" — what does user experience success look like? If you were leading a team, how would you help them think through that part?


Jeff Gothelf  27:48  

This happens all the time. In fact, it's amazing: every time I start with a new client and do a little discovery work, one of my first questions out of the gate is, "How do you measure success?" And you know what they always say — 100% of the time, without fail? "That's a really good question." I can mouth the words along with them, because they rarely think about it.

So for me, it's about setting success criteria at the beginning, and then shifting the conversation away from outputs and toward outcomes. Outcomes are customer-driven: what do we want our customers to be doing differently if we're doing a good job? And doing a good job means we've built a great user experience — the appropriate user experience for this target audience and the problem we're solving for them. That's quantifiable, it's measurable, because there's a baseline today: they're doing something right now, whether with you or with somebody else, to deal with that problem. So as you build and evolve and iterate and improve that user experience, in theory they should be doing more of it with you, more successfully. That's quantifiable, that's measurable — that's the user experience.

And that's the conversation you have to have — with the organization, the team, the product manager, the designer, whatever it is — at the beginning. I think it's a designer's responsibility to ask their product manager when a project or a new initiative is being kicked off: if that's missing from the conversation, your hand goes up. "How do we measure success with this?" Because, again, I see it literally every day: here's the backlog for this week, here are the 15 things we have to build over the next sprint, and the 50 things after that, and the 50 things after that. And it's like — why? How do you know this is the right thing to do? Without the evidence, without the data, it's easy to default to the HiPPO: the highest-paid person's opinion. With data, you can at least contest them, or offer up an alternative course of action.
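
One lightweight way to capture what Jeff describes — success criteria with a baseline, set at kickoff — is a simple record per outcome. This is a sketch with made-up field names and numbers, not a prescribed format:

```typescript
// A success criterion as Jeff frames it: a customer behavior we want
// to change, a baseline measured today, and a target to iterate toward.
// Field names and values are illustrative only.
interface OutcomeCriterion {
  behavior: string; // what customers should be doing differently
  metric: string;   // how we'll measure it
  baseline: number; // where it stands today
  target: number;   // what "success" means for this initiative
}

const kickoffCriteria: OutcomeCriterion[] = [
  { behavior: "complete the application without help", metric: "form completion rate", baseline: 0.42, target: 0.55 },
  { behavior: "self-serve instead of calling", metric: "call-center inquiries per week", baseline: 1500, target: 1000 },
];

// If this list is empty when a new initiative kicks off, that's the
// moment the designer's hand goes up: "How do we measure success?"
console.log(kickoffCriteria);
```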


Steven Cohn  30:34  

So let me make the implicit explicit. Why should a UX designer ask that question about what makes a design successful? Why shouldn't a UX designer just say: the product manager tells me this is important, the executives bought off on the roadmap, he gave me the spec, I know the user, I know the problem we're solving — why is it not good enough for me to just do the design and go about my day?


Jeff Gothelf  31:11  

Because that's extra work, right? Well, first of all, I think it's irresponsible. As a designer, I don't think you're doing your job responsibly — and look, we can go down an ethics path for a while here as well. I gave a talk, called "Sense and Respond," about continuously optimizing — learning our way to better outcomes — and it speaks to exactly this. It's this sense of: "Well, you told me I have to drive engagement. Okay, I drove up engagement." Yeah — and you radicalized a bunch of angry young white boys into being Nazis, because the YouTube algorithm that incentivized engagement served up new related videos in a compelling way and got them watching hundreds of minutes of white-nationalist videos. So it's irresponsible to blindly say, "We're just going to optimize this," without looking more broadly at the impact you're having on the customers you're serving and on the general welfare of the public. And I've got story after story after story like this, if you want to take it down an ethics path.

It's irresponsible, and it's unnecessary at this point. Maybe years ago, decades ago, it was impossible or really difficult to get this data, and maybe then you could be excused for not doing it. But you're not excused today. We've got the tools and the capabilities to make this accessible and readable, and if you can't get it, you need to be asking the question. It's really, really easy to optimize for the 80% case, but it's the 20% case where it breaks down — and in some cases, it breaks down really badly.


Steven Cohn  33:25  

Right. So what about career — what about the impact on career? Do you think it has any impact on a UX designer's career — their current job, promotion, future jobs — whether or not they measure the outcomes of their work?


Jeff Gothelf  33:49  

Look, it's interesting: in the job descriptions I read today, it's almost inevitable that a posting for a product manager or a designer will include data-informed decision-making — usually it says "data-driven," and you can get into that distinction — but rarely is it not in the job description. The question is: can you actually exercise it? I think you need to be able to speak to it, to prove that you have experience doing this, that you care about it, and that you understand what it's for. So from a career-progression perspective, I think it's absolutely critical. The real question comes when the rubber hits the road: okay, you got the job — now actually exercise this muscle you told us you have. But that's a different part of the conversation. For you to go to a job interview today as a UX designer and say, "Yeah, we shipped it," and when asked how it went, "I don't know" — to me, that's an unacceptable answer. As a colleague, as a hiring manager, "I have no idea" is unacceptable.


Steven Cohn  35:25  

Yeah, sorry — that's really interesting. I'm thinking back to the designers I've hired, and I'm wondering if that should have been almost a disqualifying question: just talk to me about the outcomes of this design project that's in your portfolio.


"Oh, well, you know, I don't know about that" — and I'm like, okay, this is not going to work out. Interesting. Hannah, do you have any other questions or thoughts?


Hannah Bertiger  36:00  

Yeah, it's super interesting, and I see all of this within my own team as well. I also notice it varies by level — whether it's an entry-level designer or someone more seasoned asking these questions. But I was curious: obviously there's the fundamental practice of just asking why — those basic tenets you mentioned. Is there any nuance or variation by industry, or is it standard, where everyone should just be asking why? When we're thinking about data and metrics for UX, we want to know: who's our customer? What's the problem we're solving? What does success look like? But is there any nuance in what success looks like, or in how you measure good UX, across industries — e-commerce, financial services, SaaS, B2B, and so on? Does anything change there?


Jeff Gothelf  37:03  

Yeah. Okay. I think it gets more complicated as you move away from standard B2C types of experiences. In a straight-ahead B2B situation, you've still got fairly direct human interaction, but you're always going to deal with at least two personas: the chooser and the user — and rarely are they the same person. So you've got to start looking at analytics to tell a story that's compelling to the chooser, and then you've got to tell a compelling story for the user as well. As the situation gets more complex — a B2B2C type of environment, say — now you've got multiple users in the workflow, and you've really got to understand what you're measuring for each of them, what matters to each, and how to communicate that across the lifecycle: there's somebody buying it, somebody using it, and somebody selling it to somebody else who's using it. So there's a lot of data to parse and then deliver in a compelling way to those folks. It does get more complex as you move away from B2C. I suppose the complexity of the task at hand could make things more difficult too — if there's a super-complex task someone has to complete online, that would make it a bit more challenging. I don't know — industry is interesting.


I mean, think about the comparison between e-commerce and financial services. If I go buy something on Amazon, I've got to find the product, choose it, add it to the cart, fill out a form, and check out. If I'm applying for a mortgage on my bank's site, I've got to find a product, choose it, fill out a form, and submit it. That's everything. Obviously you want to set expectations differently — the processes will be longer or shorter, that type of thing — but like I said, I think it gets more complicated as you move along that spectrum. More than industry, I think the nuance is in the differences between B2C, B2B, B2B2C, and so on. That's where it's maybe felt a little bit.


Hannah Bertiger  39:56  

Interesting. I was also curious — circling back and digging a little deeper. We've talked about culture, the fundamental questions teams need to ask, how that sets them up for success, and how they should be thinking about analytics and data. But aside from having too much data, not knowing how to read it, or not having access to it and needing to figure that out as an organization — are there other gaps you commonly see where design teams struggle, or perhaps where new hires or new leaders in the design space struggle?


Jeff Gothelf

Okay. I think it comes down to culture, and onboarding, and some level of autonomy assigned to the team and to the individual members of the team. What do I mean by that? I mean that if I'm brand new to an organization, then as I'm being onboarded, one of the things I should be onboarded onto is the way we collect, analyze, and report on usage data for the products and services — the digital processes — that we build. What do we do with that data? How do we use it? That should be part of my onboarding process. My guess is that it's not. It's been a while since I've had an in-house job, but I can't recall ever having that be part of my onboarding. I don't want to speak in absolutes, but I don't think it happens very often — let's put it that way.

And I think the organizations where it does happen are those unique organizations that are true learning organizations, led by humble leadership teams, that truly embrace this kind of creativity and innovation — which is the only way this stuff happens. We're talking about analytics, but it's really the culture that enables the collection, use, and impact of analytics that makes the difference in the success or failure of the organization. That's the canary in the coal mine. If you come into an organization and ask, "What do you use for analytics?" and the answer is "We have Google Analytics" — the free product — that's a red flag for me. Literally a red flag.

And so to me, that could be a really interesting angle: if you're job hunting and you truly want to find a learning, customer-centric organization, that's a really nice canary-in-the-coal-mine question to throw in during your job interview.
That'd be a red slide, literally. And so, and so to me, that that could be a really interesting angle. Like if you're job hunting, Right. And you truly want to find an ad's a learning customer centric organization. That's a really nice sort of, again, kind of a Canary in the coal mine, a question to throw in during your job, your job.


"What do you use for analytics? How do you use that data? When's the last time you made a product decision using data?" This is a blog post right here: interview questions you can ask to understand whether a company is a learning organization, a customer-centric organization.


Absolutely. It becomes a really interesting way to empower the designer, or the product manager, to find organizations where they'll be more likely to thrive and succeed. And frankly, you can ask that question of everyone — usually when you interview for a job, you interview with more than one person. Ask it of every single person, because you'll probably get five different answers.

Steven Cohn

Yes. Yeah. One of the things I've found is that very few people outside of marketing actually use Google Analytics. Most of the people who say they have it — you ask them how often they log in and use it: "Well, you know, not that much." Do you tag specific individual things? "Nah, not that much." We need to write a blog post about why Google Analytics is a marketing tool, not a product-design or product-development tool, because it's just not — there's so much friction involved in using that product for the wrong purpose. And I always tell people: Google created this analytics tool and gives it away for free not because Google is an altruistic company; they do it to sell Google Ads. That's the point. It's a tool for marketers to buy more Google ads; it's not a tool for product management or product design. So yes, it's free from a software-license perspective, but your time is worth much more than the license fees, and it's not free from that perspective, because it's a tremendous burden to actually use the product.


For a landing page, it's not a big deal — you've got two or three buttons on a landing page, you throw some tags on, and that's marketing, cool. For an email campaign, it's one tag you've got to create per button or whatever — not a big deal. But for product management and for design, it's just not built for that, and it's a disaster.
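
To illustrate the tagging burden Steven is describing: with Google Analytics' standard gtag.js setup, every interaction you want to measure has to be wired up by hand. A minimal sketch — the event and parameter names here are our own illustrative choices, not a recommended scheme:

```typescript
// gtag.js requires an explicit event call for each interaction you want
// to measure — fine for a few landing-page buttons, painful for a full
// product UI. Event and parameter names below are illustrative.
declare function gtag(command: "event", eventName: string, params?: Record<string, unknown>): void;

function trackClick(elementId: string, eventName: string): void {
  document.getElementById(elementId)?.addEventListener("click", () => {
    gtag("event", eventName, { element_id: elementId });
  });
}

// Two or three of these on a landing page is manageable...
trackClick("signup-button", "signup_click");
trackClick("pricing-link", "pricing_click");
// ...but a product with hundreds of interactive elements means hundreds
// of hand-maintained calls like these, which is Steven's point.
```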

Awesome. We're basically near our time here. Any last things, Hannah? From my perspective, this has far surpassed my hopes. I think we could easily turn this into 10, 15, maybe 20 blog posts — like I said: three things to look for when interviewing with a company, five questions to ask when you're interviewing a new designer, the three most important questions for determining UX success, stuff like that. There's just tons of content in here.

Hannah Bertiger

Yeah, I was going to say, there are a lot of great nuggets here. Just out of curiosity, to pick your brain: I also come from a product background that's heavily focused on HR, and some of the things you talked about were really interesting. When I onboarded for my job, I actually onboarded through Jet.com, where I was hired, and they talked a lot about these things — our onboarding team was very much set up like a product team, implementing these processes and talking about our customer. On the Walmart side, it's different. But I was curious: as you talk about removing that psychological risk or fear to truly enable data democratization, have you ever recommended that teams partner with HR to help implement some of these organizational changes? Because sometimes those are the people who have more of a voice within an organization to help with that.

Jeff Gothelf

So, yes — but not explicitly for analytics. I have become increasingly aware of, and involved in, the power that HR has to impact how an organization works. Inevitably, as you peel back the layers of the onion, as you try to change the culture, the ways of working, and the processes, you end up in two offices: the finance office and the HR office. And it's interesting, because in the HR office, what we're trying to get them to understand is that if they can change the incentives — the performance-management criteria, how you measure me as an employee, how you reward and incentivize me — it fundamentally changes what I do at work.

If you reward me for individual heroism, delivery, and attendance, then I'll come to work every day and be the hero: I won't collaborate, and I'll ship as much stuff as I can. And that's, generally speaking, how organizations measure and reward people. But if you shift that towards collaboration, learning, and customer-centricity — metrics around those kinds of behaviors — then what you're doing is explicitly telling me to go and learn whether or not I'm being successful, and you're making it safe and okay for me to do that. In fact, you're rewarding me for learning. And if you're rewarding me for learning, I have to go learn stuff, and part of that learning is getting the data. So if we don't have the tools for it, I'm going to raise my hand and say: listen, you're measuring me on understanding the customer's experience in the product,

and we don't have analytics there. I have no idea what they're doing. I can talk to a few of them every week and get a sense of why they're doing certain things, but I don't have the quantitative side of the equation. So that's a very long answer to the question, but the short answer is yes.

Hannah Bertiger

Awesome, thanks. Yeah, I think that's super interesting, and something for people to think about as they interview as well.