Steven Cohn 0:01
Hello, I'm joined here today by my good friend Becky Buck. Hi, Becky.
Becky Buck 0:06
Steven Cohn 0:07
How are you? And of course, Hannah. Becky, before we get started, why don't you share your background and your experience building great products?
Becky Buck 0:20
Sure. So I've been working in product design, and applied research for product design, for over 15 years. My professional training was actually in 3D, three-dimensional, product design: things like CAD modeling, making stuff. Very early on I realized that I was actually far more interested in all of the rationale and prioritization behind how stuff gets made. There are a million decisions in how stuff gets made, and that continues to fascinate me even almost two decades later. That early experience took me into innovation consulting, where we were largely doing service design for Fortune 500 companies. And in working in that space for years, doing just dreamy, dreamy projects, helping companies making cancer drugs have more effective drug rollouts, helping high-utilizer programs in healthcare create better service for their patients, I realized how little I knew about digital design and how all of the technologies that power our world work. I didn't know anything about databases, or what would be easy to do in digital design versus three-dimensional design. So that led me, a few years ago, to the Bay Area to learn what I didn't know about digital product development and how these systems work. I started working at Salesforce and grew into a research and strategy role there, helping early stage projects get off the ground, helping teams build language around the work they were doing and build frameworks for decision making and prioritization, and leading research for their Internet of Things product. After a few years of working with awesome folks at Salesforce, I moved over to AWS and led a special projects team there of combined researchers and designers. Then I did a stint in startups, working at a local fintech, Varo Bank, the first of the fintechs to actually get a banking charter. And these days I run my own agency: my business partner, Scott, and I help companies solve business problems through research and design.
So I've really seen the whole gamut: I've done this work from the perspective of small lean startups, big organizations, and everything in between.
Steven Cohn 2:54
What's the name of your agency, of your studio? It's Forge?
That is right.
Becky Buck 3:03
It's a nice URL structure. So you can say Forge is a team, Forge is working on...
Steven Cohn 3:07
I like that. I like that. So, alright: you come to the Bay Area, Salesforce, AWS, and a hot fintech startup. That's a pretty good pedigree, right? Were those all, would you say, individual contributor roles, leadership roles, what?
Becky Buck 3:25
It's a mix, usually more leadership, but I'm one of those people that is very reluctantly letting go of the idea of the player-coach. I love folks who can get their hands dirty and provide that leadership while staying pretty close to the work, right?
Steven Cohn 3:44
I totally agree. So talk to me. Those are three great case studies that I think people will be very interested in: Salesforce, Amazon AWS, and a hot fintech in Varo. Maybe you could pick one, or a mixture of each. The topic I want to go deep on is being an outcome-focused organization. As a product development organization, how did you see Amazon or Salesforce or Varo set up their structure, collaborate cross-functionally or not, and try to be an outcome-focused organization? It's something people talk about a lot, but if we're being candid, I'm not sure everyone really is outcome focused. So what does that mean? What did that mean at Salesforce, at Amazon, at Varo? And how did you work towards it?
Becky Buck 4:45
It's so hard, right? Because you suddenly have to define which outcomes, and at what level, and how you start to align folks. So honestly, Salesforce, I think, does a pretty amazing job. I know they've written case studies about their switch from their original engineering process to agile, but I wish they actually shared more about how they run their design reviews and their sales kickoffs, because they're a company that I think has actually implemented OKRs pretty well. They call them V2MOMs, and there's tons of literature on what that means; they're basically OKRs. But one of the things they'll do in the sales kickoff is Benioff actually says, you know, "Executive A, how does this priority sit against Executive B's priority three?" and makes people actually be accountable in public and say, yeah, I get that my number three priority is lower, so I'm not going to ask for resources until their priority one is covered. Because that's so much of where the problems occur, right? How do you make trade-offs and balance when goals are in conflict? It doesn't seem like they're in conflict, but the reality is, when you start trying to plan the resources to accomplish these things, that's where the devil's in the details and things get tricky.
Steven Cohn 6:13
Okay, so let's go into that. So each organizational product lead sets these OKRs, or sets these KRs, maybe they get the O's handed down, right? They set the KRs, and then they have a public discussion of trade-offs of resources. Because is development a shared resource there, or design a shared resource? Or is it allocated to the team?
Becky Buck 6:41
Goals were public within the company, so they have a great culture of broadcasting those discussions to lots of folks internally. I wish it was more transparent to the public, to the community, because they do a great job at it. But yeah, this was years ago now, and I'm sure their work structure has changed a handful of times. When I joined Salesforce the UX team was something like 60 or 70, and when I left it was at three or 400. I'm sure it's much larger now.
Steven Cohn 7:15
Okay, okay. So then, is design a shared resource that kind of gets pulled from, and development shared? Or do you have, like, pods? How is that organized at Salesforce?
Becky Buck 7:29
Um, again, it's shifted so much that it's probably not the best example for me to talk about. It started with design as a shared resource and then became non-shared, and, you know, went through the whole arc of the adults coming in to look at the budget and how things are tracked and allocated, and everyone going, oh God, we must fix everything.
Steven Cohn 7:46
Okay, okay. So they set the OKRs, they have this kind of internally public dialogue around resource allocation, what's more important, et cetera, and they come to a plan. And then how do they track success?
Becky Buck 8:06
That's the magic, right? Trying to say, success for this head of a business line is this revenue metric, and success for their direct reports is coming in on time and on budget, and then success for the people supporting them. So you get that ripple-down effect of trying to contextualize: how do we organize groups of people, hundreds, sometimes thousands of people, around a shared goal and help each one of them find meaning? It's hard to do, right? And that's why I'd say it's certainly easier, and probably more relevant for us, to talk about smaller work teams: folks managing roadmaps of maybe 20 or 30 engineers rather than thousands. Because so few people are even in a position to be making decisions at that scale.
Steven Cohn 9:01
Okay. So does that mean you want to transition over to a different case?
Becky Buck 9:10
Yeah, I'm thinking of the mindset of some of the startups that my current business, Forge, works with. Things like Varo Bank, where they had probably a better example of pods. In that context, design started out as a shared resource, with two designers serving something like seven PMs, and I think they've even transitioned now into allocating designers within scrum teams.
Steven Cohn 9:40
Okay, okay. So tell me a little bit about that. What would a typical scrum team at Varo look like?
Becky Buck 9:48
Um, five or six engineers, one designer, a product manager. Pretty standard stuff.
Steven Cohn 9:54
Got it. And so they would get together and review their, kind of, features? And then would they set goals? How would they determine success? Would they say success for this feature is X, or not?
Becky Buck 10:13
Absolutely. One of the things that I love about startups is the messy beginning, that process of becoming, because there's always the intended approach and then the reality of pressures and conflicting ideas and norms. Which is another way of saying we were just getting into the discipline of creating OKRs. We got to go through that fun exercise of moving from "here's what we need to do, because we have this investor call coming up, and we really need to think through this problem and be able to show our investors that this is a thing we can accomplish," towards "wow, our customer base has grown hugely, and now we're hearing all of these complaints about this particular bug or this feature. How do we map that into the roadmap and make sure we're accounting for it?" Like, shoot, we have customers, right? We have customers! How do we deal with this?
Steven Cohn 11:11
Right. Okay, so when Varo would be in their sprint planning or something like that, and it's "okay, we're going to work on this," maybe it's an epic, right? Did you have metrics? Did they, or even your clients at Forge, say, "for this epic, success is achieving this metric," whether that's an engagement metric, a task success metric, an adoption metric, whatever it is? Are there specific metrics that they set and then track towards? Or is it still all kind of high level: revenue, overall engagement, overall adoption rate, maybe NPS, which is a very touchy subject; we're not NPS fans here. But is there stuff like that? Is it kept high level, or do they bring it down to the individual pods, and then to the individual epics or sprints?
Becky Buck 12:19
We were just successfully bringing it into individual pods and individual sprints. So there were definitely engagement metrics, feature by feature. And I think that's one of the fun nuances: one metric does not apply everywhere, especially in complex tools. Our worlds are getting more and more complicated, so you have different user segments that are going to interact with the product in different ways, and different features might naturally have different usage patterns. You can't say a feature is unsuccessful if less than half of our customers are using it, because some features might be intended to only be used in emergency situations. Like, most banks now have that cool feature where you can turn off the ability to use your debit card, because you think you might have lost it, but it's probably in your car and you don't want to go check right now. You just flip that switch for confidence, right? That is a feature that shouldn't be used very often, but it's still important that it's there. So you wouldn't want to gauge the value of that feature by its usage alone.
Steven Cohn 13:24
Sure. Of course. Yeah. Yeah.
An interesting example: someone told me that at Google, one of the things they try to do, at least on the search team, is have you spend the least amount of time on search. Most companies want time on site, overall clicks, engagement. Their goal is to give you the most relevant result as quickly as possible and get you off their site as quickly as possible; that's success for them. It's the opposite, which is pretty interesting. What have you seen be the biggest challenge for teams? I have this conversation with a lot of teams, and saying "we want to be outcome focused" is like saying you like puppies, right? It's not controversial. Motherhood and apple pie; what's not to like? But there's a roadblock between wanting to be an outcome-focused organization and actually being one. Those are two different things. You've had the luxury of working at some really highly respected organizations, Salesforce and Amazon, a really respected hot fintech in Varo, and then, through your agency and consulting work, with a lot of other companies. What's been your experience? What's the barrier, the biggest thing stopping organizations from truly, not just lip service, being outcome-focused organizations?
Becky Buck 15:01
The first barrier is always access to data. People can often think through the metrics that they'd love to have, and it's such a hurdle to go and find out: is that information we're actually collecting? Is it information we can trust, because we're collecting it accurately? And how do we make decisions based on the information that we have? In my current consulting practice, that's usually one of the tough conversations we have with clients at the beginning. They come in knowing "I need to ship this new website by X date," or "I have a product launch coming at Y date; success is not screwing that up." And we consider it definitely part of our responsibility to coach them towards more outcome-focused metrics, because that's the best way to quantify and advocate for the business impact that design and research can have. We know we're going to help them meet that launch date, but the really exciting part is when we can help them meet that launch date and it increases conversion on the signup flow by 30%, right? Because nobody gets celebrated for hitting the launch date; that's expected. But when you can tie it back to business metrics, and show how graphic designers and qualitative researchers are actually having that level of impact on the business, that's when things get fun. That's when headcount grows, when you get more timeline and more budget to do the really fun work of UX.
Steven Cohn 16:44
Right. So, access to data. Why do you think access to data is the barrier? Is it cultural? Is it tooling? Is it interest? Do you think people just don't care? They just want to show up, check out their code, write their code, check it back in, no bugs in their unit tests, and they're happy? What do you think the reason is? Is it that they don't care?
Becky Buck 17:11
Definitely not a lack of caring. That's one of the luxuries of working in a city with a lot of high-performance people and companies: I've never worked with dumb colleagues or lazy colleagues. Folks are very smart, very intrinsically motivated, and really passionate about serving customers. The flip side of that: I'd say the initial challenge is always budget. How do I get budget for tooling? How do I make the case for spending engineering time on instrumentation? How do I even figure out who to ask, depending on how far away in the business I am from the people who can do that? How do I navigate that challenge to figure out what we have access to today? How do I get a login for whatever tool we're using for instrumentation? And then once I get there, oh my God, how do I make sense of the gibberish that's there? It's all in a coded language, and it often takes significant onboarding to even figure out what you're looking at. And in pressure-cooker cultures, where folks are often judged by the number of tickets they're delivering on, the engineers who could be helping with that translation often have very realistic pressure not to, because they're tasked and allocated to delivering on something else. So there's a huge skill shortage, for sure, of people who can self-serve, go into usage data tools, and find those answers for themselves. The industry definitely needs more people with UX backgrounds interested in, and feeling empowered to, go do that. But there is a huge learning curve. I've had the luxury, thankfully, of having teams and colleagues who helped me with that, because it is really intimidating. I don't even like using those tools; I don't consider myself great at data science by any means.
And I've been lucky enough to have had the resources to hire statisticians who can go in, help with some of that stuff, and walk me through what the heck I'm actually looking at in Amplitude, and how we take this data, which is so disconnected from the product that I know, and translate it. What does that actually mean? Right there I can see it says there's a 13% drop-off. What am I even looking at? What page is that on? How does that number relate to the world that I understand?
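For readers who want to connect those dashboard numbers back to something concrete: the drop-off figure Becky mentions is just a ratio between adjacent funnel steps. A minimal sketch in Python; the event names and data here are invented for illustration, not from any real product:

```python
# Illustrative sketch: computing step-to-step drop-off in a signup funnel
# from raw event logs. Event names and users are hypothetical.
from collections import defaultdict

FUNNEL = ["view_signup", "enter_email", "verify_email", "complete_profile"]

events = [  # (user_id, event_name) pairs, e.g. exported from an analytics tool
    ("u1", "view_signup"), ("u1", "enter_email"), ("u1", "verify_email"),
    ("u2", "view_signup"), ("u2", "enter_email"),
    ("u3", "view_signup"),
]

# Which events did each user fire?
reached = defaultdict(set)
for user, name in events:
    reached[user].add(name)

# A user counts for step N only if they also hit every earlier step.
counts = []
for i, step in enumerate(FUNNEL):
    n = sum(1 for steps in reached.values()
            if all(s in steps for s in FUNNEL[: i + 1]))
    counts.append(n)

# Drop-off between adjacent steps, the number a funnel chart reports.
for i in range(1, len(FUNNEL)):
    drop = 1 - counts[i] / counts[i - 1] if counts[i - 1] else 0
    print(f"{FUNNEL[i - 1]} -> {FUNNEL[i]}: {drop:.0%} drop-off")
```

The same logic is what a tool like Amplitude runs at scale; the hard part in practice is trusting that the events were instrumented and named consistently.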
Steven Cohn 19:58
Let's talk about culture. With Forge, you have the benefit of working with, I assume, a lot of founders and senior executives. What cultural advice would you give a founder who comes to you and says, "Becky, I really want my organization to be much more outcome focused. I don't want them just shipping code and stories and moving on to the next thing. How do we do that?" Tooling is definitely a challenge, but what else?
Becky Buck 20:37
I encourage everyone to think about data as triangulating truth: any one source of data probably gives you an incomplete picture and lies a little. We had done a pretty amazing study back at Salesforce, one of the best outcome stories that I have, where we knew anecdotally that admins would often give their login to consultants to do hard stuff for them. If there was advanced configuration or advanced automation work to be done, they would give their login to someone else, so it looked like the admin was logging in and doing it, but it was actually a very technical, highly skilled person. And because Salesforce invests in research and has an amazing UXR team, we were able to measure that and actually show the percentage of admins doing it, by triangulating data. If you had looked at usage data alone, there was no way to tell, but by triangulating it with survey data and qualitative feedback, we were able to get a pretty good picture. And I think that's always the challenge: how do you bring those different data sets together, and how do you know what data you can trust? In terms of culture, John Cutler has a great phrase: just start measuring. That is for sure the place to start. Spend money on it from the beginning and just get started, because you can learn over time where the gaps are, and what you might not be seeing and might need to fill in through other data channels.
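As an illustration of the triangulation Becky describes, here is a toy join of usage data against survey answers. All identifiers and numbers are hypothetical; this is the shape of the analysis, not the actual Salesforce study:

```python
# Illustrative: triangulating usage data with survey data.
# Usage logs show which "admin" accounts performed advanced configuration;
# survey answers reveal who actually delegates their login. Data invented.

advanced_config_users = {"admin_a", "admin_b", "admin_c", "admin_d"}

survey = {  # admin_id -> "Do you share your login with a consultant?"
    "admin_a": "yes",
    "admin_b": "no",
    "admin_c": "yes",
    "admin_e": "no",
}

# Usage data alone says four admins are doing advanced work themselves.
# Joining with the survey tells a different story.
surveyed = advanced_config_users & survey.keys()
delegating = {u for u in surveyed if survey[u] == "yes"}
rate = len(delegating) / len(surveyed)
print(f"{rate:.0%} of surveyed 'power admins' actually delegate the work")
```

The point is not the arithmetic but the join: neither dataset alone answers the question, and each covers a slightly different population, which is why you also sanity-check with qualitative interviews.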
Steven Cohn 22:18
Okay, so step one: just start measuring. You don't have to know exactly what you're going to do, and have everything fully baked, before you get started; just start tracking stuff, and then go from there. What about incentives? How would you advise around team incentives? And incentives, by the way, don't have to be financial. From a cultural perspective, I'm a CEO and I want to drive an outcome-focused organization. Any thoughts on incentives?
Becky Buck 23:03
I find some of the most powerful incentives are showing how you're helping users. People want to find meaning in their work. Assuming you're working at a tech company, your salary is fine, but more often than not you don't know how the work you do every day contributes to the business. You've been working on this widget for months, you're trying so hard, and you don't know if it's actually making any impact. So I'd say incentivize people by giving them clear milestones to hit, and have that kind of cultural story about how the features and the products are actually contributing to making people's lives easier.
Becky Buck 23:50
Yeah. And when you can make that tangible, so that it's not just "make this feature faster" but "help people accomplish this task in five clicks or less," you can get to really tangible metrics for whether you're accomplishing it or not.
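A target like "five clicks or less" is straightforward to check once clicks are instrumented. A hypothetical sketch, with made-up per-session click counts:

```python
# Illustrative: checking a "complete the task in 5 clicks or less" target
# against per-session click counts for completed tasks. Data invented.
clicks_per_completed_task = [3, 5, 4, 9, 5, 12, 4]

TARGET = 5
within = sum(1 for c in clicks_per_completed_task if c <= TARGET)
share = within / len(clicks_per_completed_task)
print(f"{within}/{len(clicks_per_completed_task)} sessions hit the "
      f"<= {TARGET}-click target ({share:.0%})")
```

A team would then track that share sprint over sprint, rather than arguing abstractly about whether the flow "feels" faster.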
Steven Cohn 24:12
Great. What about Amazon? Any interesting stories about how Amazon is outcome focused and really drives that? I know they're a very analytical, very data-driven company; I hear that anecdotally, never having worked there myself, so my assumption is that they're a very metrics-driven organization. Can you share your experience? I understand you left a little while ago, but Bezos has been there the whole time, so my guess is the process, while I'm sure it's been optimized, is probably pretty consistent.
Becky Buck 24:52
My experience was a bit of an outlier, because in the same way that they are definitely very data oriented, it's also a company of really great visionaries; they're not scared to place big bets. The team that I worked on, I joined when we were an org of three people and a couple of Post-it notes. There was nothing to instrument or measure, nothing but an idea. By the time I rolled off the project there were users and a live product, but we were still in the very early phases of any amount of product instrumentation or putting data in place.
Steven Cohn 25:28
Yeah. So how early did they start giving you outcome metrics? Did they say, "the reason we want to do this is because of these metrics; this is what success looks like"? Or was it kind of, "hey, this might be interesting, go explore"?
Becky Buck 25:48
A bit of both. It was interesting: we were working on an initiative that had significant funding and lots of buy-in. It's live now: Amazon Honeycode. It's similar to Airtable and the other app-builder products that folks working in tech may be aware of. Because there's such a shortage of engineers in the world, the world needs more ways to build apps, and it's getting them through things like Airtable and these very customizable databases. So this was an example of a big bet, where the company knew this was functionality and a product that they needed, so there was a clear mandate to go figure out how to make it happen. And it was on the UX team, working with product management at the time, to define who the users for this product are and how we start to quantify the early indicators of success and adoption. How do we start to think through: when people get to task X, that means it's clicking, that they're seeing the value, that they're seeing how this works?
Steven Cohn 27:06
Great. Okay, so a combination of visionary bets and, even early on, setting some metrics around engagement or adoption rates. And were those socialized among the whole team? Absolutely. Okay. Did you have some sort of progress chart or something like that? Or did you just check in once in a while?
Becky Buck 27:34
Gosh, there's an interesting phenomenon in the Bay Area of being part of really high-growth teams. In the year-ish that I was there, we went from three people to like 120, and we were in three different geographic locations. So there are probably a million better ways we could have socialized some of that information and had more centralized databases for things. But that's also the type of thing where, when you start to see success and find those correlations that say, when people accomplish task X, we're learning that's the tipping point, those are the things that you broadcast widely and start pointing everyone at to say: this is the new priority. How do we optimize for getting people past this hurdle? How do we get more people to this point where, by accomplishing these tasks, we're seeing that they're getting the value of the product, they're finding their way through things, they're having success?
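The "tipping point" analysis Becky describes, finding a task whose completion correlates with later success, is often first approximated by comparing retention between users who did and didn't reach a candidate milestone. A toy sketch with invented data; note this shows correlation, not causation:

```python
# Illustrative: testing whether completing a candidate "aha" task
# correlates with 30-day retention. Users and outcomes are invented.
users = [
    # (completed_task_x, retained_after_30_days)
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, True), (False, False), (False, False),
]

def retention(group):
    """Share of the group that was retained (True counts as 1)."""
    return sum(r for _, r in group) / len(group)

did = [u for u in users if u[0]]
didnt = [u for u in users if not u[0]]
print(f"retention | did task X:     {retention(did):.0%}")
print(f"retention | skipped task X: {retention(didnt):.0%}")
```

A large gap between the two groups is the signal that gets broadcast as "get more users past task X"; a careful team would then run an experiment before treating the milestone as causal.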
Steven Cohn 28:40
Interesting. What about the skeptics? What about the skeptics who say, you know what...
Becky Buck 28:51
I'm one of them, yeah. I am totally a data skeptic. As a qualitative researcher, I'm often the one making the case that just because we're not measuring something doesn't mean it's not important, and doesn't mean it's not measurable. That's where I stand firmly on the need to triangulate data. Product usage data is critically important to success; I don't know how anyone can be a good product manager without that information. But it's also not a single source of truth or a magic bullet. You have to triangulate it with all of the other resources you have at your disposal, customer interviews, qualitative information, surveys, to figure out what's really happening: that broader, more robust picture.
Steven Cohn 29:41
Yeah, that's interesting. For some reason, the analogy that comes to mind is baseball. I'm a big baseball fan, and baseball has gone through this renaissance of analytics, but there's this tension between the analytics people and the old-school scouts, who are your qualitative: they see the player with their eyes. And the good general managers say it's both: I need my analytics team and I need my scouts; I need to triangulate and put it together. To me that's a good analogy for how the pieces fit together. You need the qual and the quant to see the whole picture.
Becky Buck 30:23
I love that metaphor, and I totally agree. That's where I use my skills as a designer and a qualitative researcher. I love frameworks and diagramming user flows, potential decision trees, models that I can work on with a data scientist to ask: Is this true? Is this accurate? Should we quantify this? Which branches are important? I'm not the best person to go find those answers, but I love working with those colleagues to find that truth and find those answers.
Steven Cohn 30:57
Yeah, that's great. And I think it can go both ways, right? You look at metrics and think, man, something seems funny there; let's go talk to users about it. And on the flip side, you could say, well, we just did a qual study, and the metrics on this were pretty good, so I don't see what's going on here. There's sometimes a bit of a disconnect between the qual and the quant.
Becky Buck 31:21
Absolutely. And I think that's the magic. You talked about culture: when that becomes a moment of fun curiosity, saying, these things aren't lining up, how do we figure out what's going on here? That's a sign to me of a very mature culture and psychological safety, all of these buzzwords we use about making a great workplace, because I think that's true. The failure scenario is when someone feels in the hot seat and is trying to hide the fact that something's not working, trying to hide the data or pad the data and show more success than might be there. Those are the tougher situations to work through. I love being in the situations where, even if your datasets tell completely different stories, you get into that mess and say: what is going on here? How do we find a path forward?
Steven Cohn 32:17
Yeah, I think that's a big key: making sure people don't see analytics and data and outcomes as a threat. Culturally, anyone who's been around product a while realizes that a lot of the stuff, if not the majority of the stuff, that you launch is not going to have the outcomes you expected, or at least hoped for, in that first version. And that's okay, right? That's okay. It doesn't have to be perfect. The question is: do we have the right culture to be honest and candid about what's happening, and the teamwork and the safe space to talk about it without pointing fingers and blaming people? "You screwed up the design, you wrote buggy code, you did this." It's not about that. It's about understanding what's happening, where the issues are, fixing them, and continuous learning, continuous growth as an organization. Absolutely.
Becky Buck 33:31
And that's something the CEO of Varo, I think, was masterful at: despite coming from really big organizations and transitioning into more of a startup culture, living in those two worlds of "everything's broken and needs to be better" and "oh my God, this is amazing, look at the progress we've made, we're going in the right direction." It's a familiar challenge for every product manager out there, but a much harder and less familiar challenge for a lot of executives. That's a really uncomfortable place to be, because most executives are measured and evaluated on very precise, binary metrics. So it's tough to ask them to step into that space of "this is really messy and uncertain. Welcome."
Steven Cohn 34:18
You know, I will say, as a full-time entrepreneur myself, it is tremendously less stressful once you accept the reality that it's an iterative process, and that your first version is highly likely not to be the right one. As long as you have that permeate your organization, and everyone understands that release dot-zero is the start of the learning, as opposed to the expectation that you just release dot-zero and move on to something else. Once you have that, it's a relief, right? Because otherwise you're asking people to do something that's just not achievable: to constantly release perfect versions that users love on dot-zero.
Becky Buck 35:16
You can just never predict the future. I'm so bad at predicting the future with accuracy; I can often project some trajectories. But we talked about the barriers to getting people into this mindset of more data integration, and there's absolutely a fear around taking a guess at early-stage metrics. If you're working on a feature that's new to the company, it's tough to say "our goal is 30% customer adoption in two months," because it's new and you don't know. The fear is that by saying that, I'm setting myself up for failure, because I don't know if I can actually achieve it. So it helps to coach companies, and product managers, to say: our goal is to baseline usage after the first month. That is success: having it in place, having shipped it, and collecting feedback on it in such a way that we know where to iterate. We can wait until that point to figure out what baseline we're starting from, and then have the conversation: does this seem high? Does this seem low? That's one way it can feel a little safer for folks. I mean, there's nothing wrong with just drawing a line in the sand and planting a flag to say: we don't know; our best guess is we've seen other similar features be adopted by about 20% of our audience in the first couple of months. Let's start there and adjust. Again, it depends on the cultural norms of the company whether that's a good approach.
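Becky's "baseline first, then set targets from comparable features" approach can be expressed very simply. The numbers below are invented for illustration:

```python
# Illustrative: establishing a baseline adoption rate after the first month,
# then sanity-checking it against adoption of comparable past launches.
# All numbers are hypothetical.
active_users = 12_000
used_new_feature = 2_200

baseline = used_new_feature / active_users
comparable_feature_adoption = [0.22, 0.18, 0.25]  # similar past launches

print(f"month-1 baseline: {baseline:.1%}")
lo, hi = min(comparable_feature_adoption), max(comparable_feature_adoption)
verdict = "in range" if lo <= baseline <= hi else "outside range"
print(f"vs comparable features ({lo:.0%}-{hi:.0%}): {verdict}")
```

The design choice here matches her point: the first goal is simply to have a trustworthy baseline, and only afterwards does the team negotiate a target relative to it.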
Steven Cohn 37:00
Right. It's not about being punitive. It's about saying: this is the goal; we don't know how realistic it is, because we've never done this before. Let's try to hit it. And you don't want to set super easy goals, because, 1% of users? I mean, okay. But 20%? That seems hard. Alright, let's try and see what happens.
Becky Buck 37:28
There are so many different norms around how to motivate people, like what counts as realistic. Some people are incredibly motivated by a hugely ambitious goal, and some people on the team might look at that and say, we're doomed to failure; I don't even know why I'm working on this, I have no faith we're going to get there. So even just getting into the habit of having goals, and figuring out how you work towards them, can provide a lot of stability for the team. You have something that's ambitious enough for people to work towards, and people start to see that you iterate over time: that when something doesn't hit the goal exactly as intended, the company has mechanisms for adjusting and dealing with it, and starts to ask questions like, what does not meeting that goal mean? Did we build the wrong feature? Did we set our mark too high? If we adjust this, do we get the outcome we need? And that's the nuance and maturity everybody's really looking for: how do we use data to get to better outcomes? How do we use it to guide us in the millions of decisions we have to make throughout the day?
Steven Cohn 38:38
Perfect. I think that was great. I appreciate your time. Any last thoughts on the topic?
Becky Buck 38:47
No, I'm super passionate about this stuff. That's it: triangulate your data. Yes to instrumentation, yes, yes, yes, from the start. And also love your qualitative UX researchers; they're trying hard.
Steven Cohn 39:00
This is a tough job. All right. Thank you very much. Thanks.