July 10, 2024
In this episode of The Revenue Lounge podcast, Randy Likas interviews Elliott Star, the head of data science at Asana, about reimagining the role of data science to create impactful data-driven applications for business teams. They discuss moving beyond traditional decision support roles to build scalable solutions that empower better data-informed choices.
Guest Bio:
Elliott Star leads Asana’s data science and machine learning efforts, enabling data-driven decisions and insights. He has over 10 years of experience in tech, with a background in robotics engineering. Elliott is passionate about applying his data skills to solve complex problems and create innovative solutions.
Key Quotes:
“If you want my job, you gotta cross train. You will cap out in terms of your data science knowledge requirements.” (00:39:55)
“I think the next maybe 12 months are gonna be the heyday for old school ML folks being like, hey. Yeah. Remember we can do that thing. Build a product layer and we can solve that problem.” (00:43:34)
Main Discussion Points:
– The evolution of data science and automation of previously complex tasks
– Shifting from traditional insights roles to building scalable, data-driven applications
– Identifying the biggest time sinks and pain points in the business to focus data solutions
– Taking an engineering approach and building business “ownership” of data products
– Data quality considerations for developing accurate models and tools
– Evaluating build vs buy decisions for maximum control and flexibility
[00:00:00] In the current business climate, you can't go anywhere without hearing about AI and the potential of AI. But behind any AI project, there's a data science team. And data science isn't just about AI. Data science, and the role it plays, seems to be at the top of every priority list across the business, whether it's AI or not.
[00:00:16] But data science is no longer just about generative AI or even generating insights. It’s about fostering smarter decision making and driving efficiency through powerful applications and close collaboration between data teams and business leaders. Today, we’ll explore how data science leaders can move beyond traditional roles to create impactful application driven solutions that empower business teams to make better data informed decisions.
[00:00:39] Joining me today is Elliott Star. Elliott is the head of data science at Asana. He leads data science and machine learning efforts, enabling data-driven decisions and insights across the company. Elliott has over 10 years of experience in the tech industry with a strong background in robotics engineering.
[00:00:53] He's passionate about applying his skills and knowledge to solve complex and impactful problems and to create innovative and [00:01:00] scalable solutions for his business. Hey Elliott, thanks so much for joining us today. Glad to. Thanks so much for having me. I don't know that I'd call myself necessarily an expert in robotics.
[00:01:08] It's been a while, but I appreciate the callback. A background? How about a background. I'll take it. Great. Well, Elliott, I'd love to start out the conversation by just having you talk a little bit about yourself, your background, and your current role at Asana.
[00:01:25] Yeah. So, like you mentioned, I did my graduate work in robotics. My specialty was actually robotic dentistry. If you want to really freak people out, tell them that their dentist is going to be a robot and watch their faces just go ashen. But I transitioned into data science pretty early on in my career, kind of before it was even a thing called data science.
[00:01:44] And actually my first job in data science was called product intelligence, which, looking back on it, you're like, what does that even mean? But it was at that point sort of the fundamental underpinnings of product analysis and A/B testing, right when it was first [00:02:00] coming out and going mainstream. I moved over to the ML side for a while, got into management by accident, and have kind of worked my way to where I am now, where I've seen a lot of different sort of playbooks and a lot of different ways of doing things at various-size companies, written a few of the pages myself, and I'm currently in my role at Asana.
[00:02:19] Excellent. So, data science has been changing so often. What's been the most significant change that you've seen since you entered this world, over the past couple of years? Yeah, most significant. I would say so much of what we spent our time and effort on 10 years ago has been either completely automated, or significant majorities of it have been automated, to the extent where folks coming out of school now learning data science
[00:02:48] are learning about the things that I was working on almost like historical artifacts. And it really makes me feel old sometimes when people are like, oh yeah, hyperparameter tuning, I read about that in a class once. And I'm like, I [00:03:00] spent countless hours doing hyperparameter tuning when it wasn't easy to do and you had to, like, have a theory on it.
[00:03:06] So I would say the transition of things we were working on to things that are now just historical has probably been the biggest. And then, in conjunction with that, I would say the sort of upskilling of the technical portions. If you think about what state-of-the-art ML was 10, 11, 12 years ago,
[00:03:22] there were maybe one or two people at a company who would be like, yeah, I kind of know how to do this ML thing because I did my PhD on it. It was called in as a specialty. And nowadays, everyone who's picked up a data science notebook can do a scikit-learn model from scratch pretty quickly.
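To make that shift concrete, here is a minimal sketch (ours, not from the episode) of what "a scikit-learn model from scratch" looks like today, with the hyperparameter search that once took countless manual hours reduced to a few lines of cross-validated grid search. The dataset and parameter grid are illustrative choices, not anything the guest names:

```python
# Minimal sketch: a from-scratch scikit-learn model with automated tuning.
# Dataset and parameter grid are illustrative, not from the episode.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "countless hours" of manual, theory-driven hyperparameter tuning is now
# an exhaustive cross-validated search over a small grid.
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 3]},
    cv=5,
)
search.fit(X_train, y_train)
print(search.best_params_, round(search.score(X_test, y_test), 3))
```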
[00:03:39] So that sort of technical aptitude has really jumped up a lot. Yeah, it's fascinating to just think about all those changes. And like you said, you were the man on the ground doing all that stuff that these people read about in their textbooks. So, I want to talk a little bit about data quality at the top of our conversation.
[00:03:55] Obviously any data science initiative really has to be grounded in good data. [00:04:00] In your experience, who owns data quality in an organization? We've seen organizations where they've got data councils or data governance groups. Is that a sub-function of data science, or is it a partnership that data science has with maybe another area of the business that's responsible for data quality?
[00:04:15] It's a good question. I think it's usually a partnership with your data engineering team, who's in charge of basically transporting data around. But I would also say it varies based on the flavor of data quality. And here's where I'll draw on past experience, having seen a lot of different pictures.
[00:04:32] There's a flavor of data quality for, like, a production system that is powered by data. So, I worked at Twitter. The Twitter timeline, when it was a ranked timeline, that was like, if data quality goes wrong, your timeline is breaking and it's doing bad stuff. And the type of teams and the type of partnerships responsible for something like that, and the data quality considerations there, are ultra different from the data quality considerations for a team that is advising your chief [00:05:00] revenue officer on a sales play strategy across the team.
[00:05:03] And so that kind of bifurcation is something that a lot of people who haven't worked in a balance of production and advisory systems wouldn't necessarily differentiate. But it really changes a lot of the things you care about and the responsibilities, and because you have very different sets of cross-checks on each of those problems, you care about very different aspects of data quality.
[00:05:22] So if you're thinking about something like the Twitter timeline: the timeline could degrade a little bit and it's still okay. It's not great, but a little bit of degradation happens. Someone messes up a feature here or there, you have a pipeline that's a little delayed. Like, it happens, but it's survivable, and you'll catch it through a lot of your monitoring mechanisms for performance and things like that.
[00:05:43] Admittedly, your SLA on that has to be a lot better. You can't wait around for three weeks while the timeline's broken to be like, ah, maybe we should wait for a data engineer to free up. You're kind of dying a little bit of a bleed-out on it. Whereas if you're making a major marquee decision for a [00:06:00] company, you're sitting down and saying, we're going to introduce our new sales strategy as a result of this.
[00:06:05] That's a one-shot sort of enterprise. You get one chance to influence that decision. You have a bunch of folks in the room who are going to have their own opinions on it, which in many ways can provide a cross-check for catastrophic data failure. But if you have something that's even marginally wrong due to data quality, and you get the wrong message across to people, that's a lot of work
[00:06:26] you've got to undo. I worked early on in my career for a company where another data scientist had put together an analysis, and it was the sort of top strategic driver for how the company made investments. And they'd made a minor methodology mistake, and there was a minor data quality issue, and as a result, it completely changed the conclusion.
[00:06:45] You'd look at it and you'd be like, nah, this is pretty minor, this wasn't a big deal, someone should have caught this on review, someone should have known this. But in combination, it completely changed the strategy. And even after they found it and corrected it and messaged it to people, you know, a year [00:07:00] and a half, two years later, people were still quoting the original flawed strategy.
[00:07:04] And so in that aspect, your data governance really had to get key parts of it right. You couldn't have a degradation. But if the strategy hadn't been something that people were already otherwise excited about, it would have fallen flat. So your risk of downside is a lot greater when it comes to those one-shots.
[00:07:22] But some people might catch some of your mistakes. Whereas when you have a production system, it's always small, constant little degradations that you just need to stay on top of and keep churning through. It's kind of like having a car that always needs an oil change: the car is not going to blow up on you, but you always have to feed a little bit of oil into the system. In that instance, when you talk about teams kind of coming up with the strategy, what role does data science play in that strategic conversation? As opposed to a little bit further down, where maybe they've identified the business problem, identified what they're trying to accomplish, and then they sort of hand it over to you, right?
[00:07:56] So what role have you seen data science play in part of that [00:08:00] strategic conversation? Yeah, if we're having this talk 10 years ago, it's a very different answer. 10 years ago, the role of data science was not super established in these discussions. It might be the case that you had business analysts across the company who were advising strategic leaders, they were making strategy calls, and maybe, just maybe, they would say, oh, we need a slightly more advanced concept.
[00:08:22] We need a slightly more advanced technique. Call in a data scientist to like run this one particular thing for us and then send them off on their way. And I’d say the difference between now and 10 years ago is now a lot of leaders across the business, whether it’s technology leaders, revenue leaders, marketing leaders, expect a lot more of that.
[00:08:41] They want to approach most of their problems and say, yeah, hit it with some pretty heavy machinery from the start. And so data scientists are getting tagged in a lot more for these kinds of things, and it really blurs the line between what is ultra-depth analysis versus what is a data-science-solved problem for a very [00:09:00] specific application versus what is advisory capacity.
[00:09:03] And companies that do it well really utilize their data scientists well. Companies that do it for the sake of, well, we have to do this because this is how people do it, tend to burn their teams out pretty aggressively. Yeah, I bet. So related to that, in an earlier conversation, you shared that the traditional role of data science in decision support is currently underutilizing its potential, right?
[00:09:24] Can you elaborate a little bit more on why you believe this? Yeah, if you think about what you’re doing as part of decision support, you’re trying to convince a human who has a relatively incomplete understanding of what you’re doing to change their mind about something. And you can be the best data scientist in the world.
[00:09:39] You can be the best communicator in the world. But if you encounter a particularly resistant human, game over. Good luck getting that person to change their mind. Or the person you're trying to convince up and goes and gets another job, and you have to convince a new person. And so a lot of teams that really make their bones by doing pure decision support are very much [00:10:00] at the mercy of just about anything else that's going on in the business.
[00:10:03] You know, your chief marketing officer decides that they want to totally change the marketing strategy and all the work you’ve done to support any past decisions is tossed out the window. You bring in a new chief revenue officer who has a new sales approach to things. All the work you’ve done is thrown out the window.
[00:10:17] And so a lot of times you see these teams have to completely reset and completely re-approach problems. Not necessarily because the modeling problem has changed or because the data has changed, but because they now have a totally different person to convince. And I played that game for a while, and you just kind of realize that there's not a long-term future in it.
[00:10:36] At a certain point, you become indistinguishable from a really expensive internal management consultant. And, you’re being brought in to advise and consult, and if people happen to go a different way, you kind of burned all that work for nothing. I still think it’s a key part of the overall data science team output and arsenal, but if it’s your only part, it’s not a sustainable business strategy.
[00:10:58] I like to think of my role as, I'm [00:11:00] the CEO of the data science portion of the company, and I need to build a business plan that is a good business plan. And if my business plan is just to run a consulting team, it's not a very novel business plan. That business plan has been around for a while, and it doesn't offer a whole lot in terms of good ways to expand or good ways to invest internally.
[00:11:18] You know, it's still a key part of the data science team output, but I index a whole lot more, and with my current team it's almost the supermajority, on actual things we can build and deliver for the company. Things that get constant daily use, that work even if we're on vacation, so that if the Chief Revenue Officer leaves and a new one comes in, they've still got the same tool set available to them, and it's still powering a lot.
[00:11:39] To me, that's a much better business plan for a data science leader. You mentioned business leaders might benefit more from well-supported applications as opposed to deriving insights from a lot of the data. Can you explain this perspective? Yeah. Let's use a good illustrative example.
[00:11:55] So, I've worked over the course of my career with a lot of sales teams. And your sales leaders [00:12:00] will always want to say, oh, you know, what's an insight that helps me sell better? And the format of that might be a report. It might be a walkthrough with some people. Sales folks tend to cycle around a lot, so you have to constantly rephrase it.
[00:12:13] Sometimes they don't believe you. Inevitably, one country territory doesn't believe that the technique they're using in a different territory works. It's a weird set of comms and approach, and you can almost never retire from it. Whereas you could approach the problem and say, hey, salespeople are making decisions about how to sell.
[00:12:32] Instead of giving them the insights, why don't I just build something that recommends things for them directly? And 11 years ago, I built my very first recsys recommending sales plays to salespeople. It was a hack day project. It didn't really go anywhere. But when that kind of tooling exists, it says: cool, I'm going to solve a key problem in sales.
[00:12:51] I’m going to build something that does it at scale. And my job now, instead of explaining, Oh, salesperson X, like here’s top drivers, here’s things you do. [00:13:00] My job now becomes, okay, how do I improve the performance of that? How do I make it more stable across teams? How do I get better data sets so that it can be more informative?
[00:13:08] How can I have a good, automated explainability solution around this thing that I'm building? And it becomes much more of an engineering and technical problem than a sort of people-and-personalities management problem. And from experience, even though data scientists like to capitalize the "scientist" portion, they really are pretty happy when they're doing a lot of engineering and build effort.
[00:13:28] Like, some of the happiest data scientists I’ve met are the ones who are entirely building things because they get to see it out in the wild. They don’t have to worry about well, did my consumer have a bad burrito at lunch today and wasn’t paying attention to my presentation? They’ve built something that’s good and out there.
[00:13:42] So, I think if business leadership is really paying attention and really doing a good job, I think they can actually request really solid products to be built that can automate and/or reduce arbitrariness in major parts of their business book. I don't think I've ever had a chief revenue officer or a VP [00:14:00] of sales come to me and say, can you build a recommender system?
[00:14:03] But the first one who does, I'm going to be like, yeah, and it's probably going to power a significant lift to your sales. So usually I'm the one suggesting it. But the more that business leaders are able to say, hey, here's this thing we can build that's really powerful in my organization and does a lot of the heavy lifting for me, those folks are going to win in the long run.
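As a rough illustration of the idea, and not the system Elliott actually built, a first-cut sales-play recommender can be as simple as a nearest-neighbor vote over historical won deals. Every field name and value below is hypothetical:

```python
# Hypothetical sketch of a first-cut sales-play recommender: find the most
# similar historical won deals and vote on which play to run. A real system
# would add feature scaling, evaluation, and automated explainability.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Illustrative account features for past won deals: seats, is_enterprise,
# prior-engagement score. All values are made up.
won_deal_features = np.array([
    [50, 1, 0.2],
    [500, 1, 0.8],
    [20, 0, 0.4],
    [700, 1, 0.9],
])
winning_play = ["land-and-expand", "exec-sponsor", "self-serve-trial", "exec-sponsor"]

nn = NearestNeighbors(n_neighbors=3).fit(won_deal_features)

def recommend_play(account: list[float]) -> str:
    """Majority vote over the plays that won for the most similar accounts."""
    _, idx = nn.kneighbors([account])
    plays = [winning_play[i] for i in idx[0]]
    return max(set(plays), key=plays.count)

print(recommend_play([600, 1, 0.7]))  # -> "exec-sponsor" on this toy data
```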
[00:14:25] So you may have answered this question just now with the sales example, but can you share an example of applications that your team has built that have helped enhance that decision-making process for business leaders? Yeah. So, going back to it, it's a little less about enhancing the decision-making process and, in some ways, more about removing the decision-making process.
[00:14:41] Every decision that your leaders have to make that they’re making either on something relatively inconsequential or they’re making with incomplete information represents a potential loss for your business. Like someone’s going to get it wrong. Someone’s not going to have complete data. If you can give them a much more automated way of taking a lot of those decisions out of their [00:15:00] hands.
[00:15:00] They can devote their time and effort to the decisions that are actually the hard decisions. Like, what is your global sales strategy? That is something I want a chief revenue officer spending most of their time on. What is our unique strategy for manufacturing in France? I don't know that I want a chief revenue officer
[00:15:17] spending a lot of time thinking through the specifics of sales plays for that. It's not a good use of their time. If I can take a lot of those decisions and just take them off of their plates, if I can take it off the plate of GMs, if I can take it off the plate of frontline managers, to me, that's where the value is.
[00:15:32] So it kind of reminds me of, and I don't know why this analogy is coming into my head, but, you know, you think of the famous Wayne Gretzky quote: don't skate to where the puck is, skate to where it's going. It almost seems like data science is helping the business leaders figure out where the puck is going.
[00:15:46] It's like, they don't quite know what to ask for, but you're able to say, hey, I can build this for you, which is going to take this off of your plate. And they didn't really know that's something that you could build. Would you say that's accurate? It's very accurate. And I think
what's [00:16:00] really interesting is a lot of people think that, oh, leadership in business doesn't know they can build this because they don't have the deep technical and machine-learning-based knowledge. And that is true, but it's not the blocker. The blocker is usually being able to have a team that can represent and build things for you that can automate away that time.
[00:16:19] If you talk to a chief marketing officer and you say, oh, do you know the difference between instrumental-variable experimentation versus, you know, synthetic controls or whatnot, they're going to be like, womp, womp, womp. But if you tell them, hey, did you think that there was a way that maybe we could do some variation across our spend and figure out the optimal blend mix without a human having to be involved?
[00:16:44] They kind of all know it intuitively. You explain it to them and they're like, well, yeah, I knew that. But they never ask for it. And so I always go to people and I'm like, start asking for pretty out-there things. And they're like, okay, can you do analysis on X? I'm like, not out there enough.
[00:16:59] Like, [00:17:00] still inside the box. Toss the box out. Start taking really big shots at things. Yeah. So it's almost like you're helping them see not just the request, but the value, right? What's the value of the application that you're ultimately able to build for them? And so, you know, I imagine having those conversations, getting it out of the weeds and up to the business value it's going to bring them, is a big part of the job.
[00:17:23] So what are some best practices to help communicate that value, to communicate not just what the request is but what it can deliver? Yeah, I think one of the most important things for data scientists to understand, because, you know, most of us don't have MBAs, we haven't necessarily worked on the business side for that long, is
[00:17:41] understanding where time and effort go in your business. If you understand that, you immediately know where to optimize. I love to use this analogy: if you've ever pulled up a performance console on a website, you can see, oh, it took this long to load this section, and wow, all of our time was spent loading these [00:18:00] video assets.
[00:18:01] It's pointless for me to optimize these other things; this is the one big thing there. If you can understand where your business folks or your partners spend the vast majority of their time and effort, that is your biggest target market to go after. For example, at the end of quarter, we get a lot of questions from teams.
[00:18:19] I'm like, great, this is appropriate for the two-hour end-of-quarter presentation, and it happens four times a year, and people forget about it by the next time. That's not a major market to go after. But I look at things where, you know, a salesperson, a marketer, a customer support person, a product manager is doing that sort of thing every day, a couple of minutes a day, even an hour a day, throughout the entirety of the year on a constant rotation.
[00:18:44] That's the market that you make there. And for data scientists, understanding where their appropriate market is can be very different from what you hear. You have to differentiate between what you're hearing from leaders about what they want and where the market actually is. And so a huge part of my job as a leader, [00:19:00] and what I tell others to do, is go do that market discovery.
[00:19:02] Go look out and be like, hey, did you know that our salespeople spend 70 percent of their time deciding what sales play to use? That's a lot of time. That's a lot of value, and it generates a lot more than what you'll potentially hear from a sales GM, who might be like, oh, we need a specific approach for manufacturing. All hypothetical examples, of course. But if you can identify your market, that's your opportunity to build much bigger products.
[00:19:29] So what challenges do you have in shifting that mindset of, I just need a bit of an insight on something, as opposed to, let's think about it a little more holistically in terms of building the application that can help you get there? I guess the question is more about the short-sighted, I-just-need-the-answer-to-this-question mindset, as opposed to, help me build a system that can actually save me time or make me more efficient.
Yeah, I'd say you end up having to say no a lot as a leader, because you're going to get [00:20:00] more requests than you can handle for insights. And every time you say no, you're playing a bit of Russian roulette. Can you say no to a frontline manager in manufacturing in France? Sure. Can you say no to the GM of the Americas?
[00:20:18] Questionable career decision there. And so a really hard part of it is that most data science teams get judged based on how much their customers like them, especially on the business side of things. And likeability is really tough when you're saying no a lot. And there are some leaders who are like, oh, I've learned how to say no in such an awesome way that people still like me. And I'm like,
[00:20:38] power to you, man. That's awesome. I would much rather just say no a bunch and get people to understand that maybe they're asking the wrong thing. And as a result, I've had business people, and I won't speak to my current company, but business people in the past have hated me.
[00:20:53] They've been like, don't call that Elliott guy, he always says no, he's such a jerk. And I can [00:21:00] either let it go and work with partners who want to work with me, or I can try to convince them: hey, me saying no is for your benefit. I'm not saying no because I don't like you.
[00:21:11] I'm not saying no because I don't want to do this work. I'm saying no because I think I can do something better for you. And that's a mixed proposition. You know, I would say my hit rate is about 50-50 across the last decade of my career on getting people to come around to that. And some people aren't willing to flip that coin. Some people say, no, I want to make sure I'm in a position where everybody has a good opinion of me,
[00:21:30] and I'm doing that. And you get a lot of data science teams and leaders who build up a really high quotient of goodwill. Sometimes they spend it down, sometimes they don't. But by building up that heavy portion of goodwill, they're kind of limiting their playbook. And no fault to them.
[00:21:46] They’re making good calls for them, but I would much rather see your data science leaders flip the coin a little more, roll the dice a little more on things. Because I think the expected value is more than what you’re going to get with playing the safe game. [00:22:00] So, I’m interested in this, you know, the discipline of being able to say no on things, right?
[00:22:06] Because when you say no, you're going to have somebody who's probably going to be a little ruffled up, because they have the question. And I would assume you probably have a bit of a framework or something you refer back to, like, how does this ladder up to the overall business objectives of the company, as opposed to just your pet project or your specific request?
[00:22:23] So I'm interested in understanding, when you say no to something, what's the framework you use to help decide, hey, this isn't really important, as opposed to taking it on? Yeah. Internally, you know, even before I talk to someone, I'm looking at what is the major swing or outcome of what they're doing, and if it's relatively inconsequential, I'll try to be like, ah, this is a little below the scale of our team.
[00:22:48] I think it was Vince Vaughn in Mr. & Mrs. Smith, who gets the text message that he needs to go hunt down Brad Pitt, and he's like, no, I ain't getting out of bed for anything under half a mil. And I sort of [00:23:00] like to put that kind of word out, where people are like, oh, can data science help me with this?
[00:23:02] And I'm like, yeah, but we don't really get out of bed for anything less than a million dollars of value. And a lot of people will do that self-check and be like, oh, okay, this probably isn't there. So there's an element of advertising what your team is there for that can deflect a lot.
[00:23:17] I think there's a secondary aspect, which is building up your sets of partnerships. So, some companies will say data analysts and data scientists are the same thing. I like to draw a really hard distinction. I like to say that data analysts are there to do that analysis for the company. They could arguably have greater analytical capacity than your data scientists do, because you want your data scientists to focus on building things and doing that creatively.
[00:23:39] So I like to really work in partnership with an analytics leader and say, hey, anytime people come to me for insights, I want to send them your way. But I want to quid pro quo that: anytime you hear something multiple times, you're going to decide that this is a really good option for a product and try to upsell your partners on working with us to build a product.
[00:23:57] And that kind of being able to route [00:24:00] traffic to each other, if you have a really solid partnership: (a) it's mutually beneficial, and (b) you generate a ton more goodwill, because you're able to pass things across teams really effectively instead of saying, eh, this isn't for us, go deal with the analysts. And you can get some pretty meaty projects out of it.
[00:24:15] Your analysts, especially ones who are really engaged and curious, will do a really good job of problem discovery for you. Some of the best business-facing models my teams have built have come from an analyst saying, hey, I got this question three times in the last two months, it feels like there's momentum there. They weren't necessarily sure exactly what it was, but they had enough on their radar that someone looked at it one day and was like, oh, snap, that's a big thing for us to solve.
[00:24:39] So I like to do it in partnership. I like to put out early comms. And then, when people are really insistent, I don't know, I'm a little more of an abrasive personality, but sometimes I just go to them and I'm like, hey, thanks for calling, we ain't the team to help you.
[00:24:58] Really sorry, it's not [00:25:00] personal. And I think that sort of clean letdown is a lot better than leaders who would be like, oh, you know what, we'll happily put this on our priority queue, with no intention of ever prioritizing it at all. Or, oh, we'd love to get involved in this, but we're really tapped out in this area,
[00:25:13] so we can get back to you. Because then that person looks at that and says, oh, well, I asked the right thing. I came to the right team. They just couldn't help me now, but maybe in the future. If I have someone who comes to me with something I don't want my team doing, I don't want them ever coming back with a request that's similar to that.
[00:25:29] I want them to know, hey, this is not our market. Do I want them to still buy our other products? Yeah. So I might turn it into a bit of a sales show: well, we can't help you out on that, but if you want to come check out our awesome set of tools on the shelf, I'm more than happy to give you a tour.
[00:25:46] But I hate having a customer come back to me and be like, I know you couldn't help me out with that analysis last time, but can you do it this time? If that happens to me, I've failed as a leader, because I haven't gotten the right message out. Yeah. Or they came back to you with the same [00:26:00]
[00:26:01] question, just asked a little bit differently. And you're like, no, the answer is still going to be no. Right. Yeah. So what advice would you give to other data science leaders in B2B SaaS whose organization is kind of in this mode of, we just want insights, as opposed to building an application-driven strategy? To help them communicate what you've been able to communicate to your leaders, what tactical advice might you give?
[00:26:23] Yeah. So like I mentioned earlier, the biggest thing is understanding the big markets at your company. You can give all sorts of great suggestions. You can talk to your team and be like, hey, everyone, come up with fun suggestions. But if you're not hitting a major market in your company and you're trying to get a program like that off the ground, it's going to fall flat, because people are going to look at that and be like, you just pulled, you know, three analyst-months of support from us. What did we get in exchange?
[00:26:46] Oh, we got a nifty thing that sends emails out with a slightly higher open rate. Not impressed. So you've got to figure out where your highest leverage points are. And then you as a leader have to almost also be the director of product. I've been really fortunate in the past. I've actually worked at companies where
[00:27:04] people have had data science PMs who've done that producting, and they've gone out to portions of the company and done that market discovery and done the product design and said, hey, here's how we do user acceptance testing, here's how we do all that. If you don't have that, because you're starting a program from scratch, you have to figure out how to do that.
[00:27:20] I was really fortunate: one of my very first jobs in tech was as a product manager. I did a rotation in the Microsoft PM lines as an intern, and you learn a lot of basics: here's how to come up with a product idea, here's how to think about product things, here's how to ship a product.
[00:27:36] And if you've never done that as a data science leader and you want to set up a program like this, you're going to have to learn while the plane's crashing, kind of thing. Because you can't rely on anyone else to do it for you. Unless you're lucky. Unless you find some amazing person in the business who's like, I will be your shepherd.
[00:27:52] I will guide your products in. But you can't just assume that if you go out and build cool, shiny things, people are going to want to come take them for a spin. You really need to do [00:28:00] that level of product advocacy and customer adoption that, if you haven't been trained on it and you haven't been coached to do it, you're not going to come to natively.
[00:28:07] So I like to think of it as: if you're getting this program off the ground, you are both the director of data science and the director of data science product management at the same time. You mentioned something earlier about some of the projects you worked on with sales leaders. At Nektar, we work with go-to-market organizations, right?
[00:28:23] So our stakeholders are going to be sales, marketing, customer success. And most of the time, RevOps tends to be the steward for data in that function, right? So they're the ones who are building the processes and implementing the tooling to make sure they can capture the necessary data for the business to make decisions.
[00:28:40] But even then, data quality is difficult, right? There's always going to be missing, incomplete, stale, siloed data. It's never going to be perfect, but it always surprises me how many organizations have learned to accept that they're always going to be operating in this world of not-the-best data.
[00:28:59] And so how does that [00:29:00] impact your ability as a data scientist to then go build applications for the business, like predicting the sales plays they should go run? How do you balance that incomplete or missing or stale information to build the right application when maybe the right data isn't there for you?
[00:29:16] Yeah, it definitely limits the sort of scope and complexity of what you can do. You default to slightly safer model architectures. You default to slightly less complex features. Forget about vectorizing anything if you're dealing with human-input data; your vectors are going to break and you'll never know.
[00:29:32] So it definitely limits your toolbox a little bit. I think what you do is you find ways to transition a traditional team like RevOps, that would do this sort of thing around data, and make them think more like an engineering team. When you have an application like a sales recommender system, you know, the one I built a decade ago, you tell your RevOps team: hey, this isn't something I just built and you guys stare at idly.
[00:29:55] You guys are a part of this. This system only works because you are the [00:30:00] ones who are helping to drive it. And so you really give them skin in the game, and you say, hey, the success of this is going to be based on the data quality inputs we've got and how we can make improvements there.
[00:30:09] And I can show you that the investments you make in data quality improve this system that we've got. If you make it so that they have that level of ownership, no one wants to own low-quality data. They might look at it and be like, it's not a priority, because humans can look at this and be like, oh yeah, they accidentally put a zero instead of an O, but a human reading it is fine.
[00:30:30] If you give them a really complex, highly technical asset that's really dependent on these things, and you say, hey, this isn't just something that you bought off the shelf that you can complain about, you are an owner in this, I've had good experience of them actually stepping up and saying, no, if this is the thing we own, we want it to be good.
[00:30:48] No one wants to own a bad product. You want to own something good, and whatever you can do that contributes to that, you'll make contributions. So for a RevOps team, where the main lever they can pull is [00:31:00] some of the manual quality entry or things like that, if they really feel ownership of it, they'll step up their game.
[00:31:06] They will bring that. And it's not like they don't have the ability to; it's just that no one's given them an incentive to. So you build a really complex thing and you say: you are a co-owner of this with me, and its success or failure is based on your A-plus efforts in this particular area of data quality.
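To ground that, here is a minimal, hypothetical sketch of the kind of automated check that turns data quality into a concrete lever a RevOps owner can pull; every column name and rule below is invented for illustration:

```python
# Hypothetical sketch: automated CRM data-quality checks. A human skims past a
# zero-for-letter-O typo, but a model joining on account_id silently breaks.
import pandas as pd

crm = pd.DataFrame({
    "account_id": ["A-001", "A-0O2", "A-003"],  # "A-0O2" mixes zero and letter O
    "annual_revenue": [1_200_000, -5, 3_400_000],
    "last_touch": pd.to_datetime(["2024-06-01", "2024-01-15", "2023-02-10"]),
})

issues = pd.DataFrame({
    "bad_id_format": ~crm["account_id"].str.fullmatch(r"A-\d{3}"),
    "negative_revenue": crm["annual_revenue"] < 0,
    "stale_record": crm["last_touch"] < pd.Timestamp.now() - pd.Timedelta(days=365),
})

# Rows with any failing rule go back to the owning team to fix.
print(crm[issues.any(axis=1)])
```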
[00:31:21] I've been really impressed at how most of the partners I've worked with have stepped up and been like, well, yeah, shoot, if I'm going to own this, we're going to do it right, gosh darn it. What role does data science play in the build-versus-buy decision? So you probably have business stakeholders that come to you saying,
[00:31:37] I found this really cool product that can do this. And you're like, yeah, but it can't do this, right? I guess the question is, when it comes to deciding between buying an off-the-shelf package versus you saying, I think I can build something a little bit better for you,
[00:31:49] what role does data science play in that purchase decision? I'll warn you, I'm super biased: I'm super against buy and super pro build. And that could be the old [00:32:00] school engineer in me speaking. I mostly do that from the basis that the total cost of ownership over a life cycle tends to be dramatically underestimated for most buy operations.
[00:32:12] And if you buy too much, you have almost an entire set of folks who are just there to integrate things you've bought and try to make them cooperate and play together. And you end up doing the math five years later and you're like, it would have been better for us to just build some of these on our own, because they wouldn't have had all the faults.
[00:32:28] And yeah, we would have had to support engineering teams to do that, but it might've been the better call. I also like owning my own destiny. You're relying on a bought tool and they have an outage? You're sitting around waiting until they fix their outage. Whereas if you build something and you have an outage,
[00:32:43] if you have a 24/7 on-call rotation, you can have folks start working on it immediately. If you have good alerting, you can start hitting that. I'm almost ultra biased towards having that destiny in my own hands. And I recognize there are a ton of good products out there. And there are some applications where you can say, okay, I'll sandbox this, I'll silo it away, it's a [00:33:00]
good buy: this is something that's an ultra-niche specialty, I couldn't necessarily hire folks to do it. But I feel like that should be more the rarity than anything else. I would much prefer people building more things internally. And of course, all the vendors are now going to come after me and be like, but we can build such great things and they're all stable.
[00:33:16] And I'm like, yeah, I know. I get it. But I've just always been biased towards building things. And I think the role that data science plays in that is helping people to understand: hey, you're getting a sales pitch from the folks on this. Maybe they're putting up a solutions engineer to walk you through more detailed tech stuff.
[00:33:32] But how does this fit into your long-term technical strategy? How do you know that the roadmap of the company we're buying from aligns with your interests going forward? How do you know that the features you're going to want will get prioritized? That's, I think, where data science can say, hey, we're not necessarily trying to compete with them, but we'll offer you a complementary option.
[00:33:51] Something that says, hey, you know, we can build something that'll help this out, we can take some of the load off of that. And maybe, if you give us the chance, maybe we [00:34:00] can build something a little cheaper, a little more flexible internally. And if we don't have the specialty for it, we'll confess that and we'll say, hey, out of our strike zone.
[00:34:07] We shouldn't go out and hire someone to do it. We can buy it, but we can put a lot of the scaffolding around it. So I think you're partially informing, you're partially giving alternative options. But I also don't like buying things in the case where maybe they just do what they do. And are they involving you to say: if you're going to buy this, at least make sure that they've got the capabilities to share the data they're collecting into Snowflake or another data warehousing application?
[00:34:37] Do they have those capabilities built? Are you playing that role at all, of making sure of the baseline requirements, so that at least we can have use of this data? Or what role would you play there? Yeah, you definitely play a role there. I think in terms of systems interop, you cross your fingers and hope that your data engineering partners get that for you, because a data scientist figuring out whether a platform interoperates with your database is a little out of their strike zone.
[00:34:57] The thing I most look at is: what things does it [00:35:00] close us off from? So, you know, I think back, I used to work in fintech, and there are a lot of companies that will offer you fraud scores, and people are like, yeah, we can get a score from this company and we can use it to block transactions.
[00:35:11] And I'm like, eventually we're going to want to do that on our own. And this company that's doing a fraud score has a bunch of underlying signals they're using to generate their score. I'd love to get their underlying signals. Because we're buying it already, we're buying it in a package; I might as well just buy the ingredients and be able to make the meal myself.
[00:35:29] And that's also great because, what if they change their score? What if, you know, you're going to have a V1 score, and then two years later they're like, ah, we've now got a V2 score, it's so much better. And you're like, cool. Objectively better, maybe, but now I need to reorient everything we do around that score.
[00:35:46] Does it have a new baseline percentage? Do we have to recalibrate it? If they changed the feature inputs, can I now compare histories against it? So I really coach a lot of the buyers. I'm like, hey, here are the things to look for that [00:36:00] don't eliminate your future opportunities.
[00:36:01] If you're relying on it as a feature input, can you always get it historically? Can you get it unchanged historically? Can you have an API that says, we need that score as it was three years ago, because that's how we're going to train our own ML model? And a lot of companies will be like, oh, well, that was our three-years-ago model.
[00:36:15] We've got the brand new one now. And I'm like, I don't care about the brand new one. I need to know what the score was three years ago, because we have a lovely true data set of fraud from three years ago. So a lot of it is coaching: you need to look for these things so you don't box us out of the future.
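The "score as it was three years ago" requirement is essentially a point-in-time join. Here is a hedged sketch of the idea using pandas; the tables, columns, and values are all hypothetical:

```python
# Hypothetical sketch of a point-in-time (as-of) join: pair each labeled
# transaction with the vendor score that existed at that moment, not the
# recalibrated V2 score the vendor serves today.
import pandas as pd

vendor_scores = pd.DataFrame({          # every score version the vendor served
    "event_time": pd.to_datetime(["2021-01-01", "2021-06-01", "2023-01-01"]),
    "fraud_score": [0.30, 0.35, 0.90],  # the 2023 value reflects their new V2 model
}).sort_values("event_time")

labels = pd.DataFrame({                 # our ground-truth fraud outcomes
    "event_time": pd.to_datetime(["2021-03-15", "2021-07-20"]),
    "is_fraud": [0, 1],
}).sort_values("event_time")

# merge_asof takes the latest score at or before each transaction, so the
# training features match what was actually known at prediction time.
train = pd.merge_asof(labels, vendor_scores, on="event_time")
print(train)
```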
[00:36:30] Yeah, that's fascinating. I have a set of questions I want to transition to in what we call our lightning round, which is a little bit more of a fun and personal take on some stuff. So let's drop in. What's one book that maybe you've read in the recent past that you really love and might recommend?
[00:36:47] I'm a big Nate Silver fan. The Signal and the Noise is a great way to approach data problems without getting too technical on them, and he's such a great writer and explainer. I really enjoyed that one. I recommend it to all my non-data-science friends who are like, [00:37:00] what's data science about?
It's not quite this, but if you can think this way, you're getting a leg up on the crowd. I'm going to add that to my list. I love it. What's your favorite part of being the head of data science at your company? Oh, favorite part of being head of data science at my company.
[00:37:14] I think I get to propose some pretty out-there ideas, and I get a lot of people who want me to tag in on their stuff because they've seen the value of some of the out-there ideas. So I'll get folks from engineering teams, from product teams, from business teams coming to me like, hey Elliott, I've got a crazy idea.
[00:37:32] And yeah, I love hearing crazy ideas. That's fun. There's always a flip side to that coin, right? Which is the least favorite part of the job. I have to spend a lot of time explaining what data science is. If I had a dollar for every time I had to explain what data science is over the course of my career, I would have retired years ago.
[00:37:48] What do most people confuse data science with? Very heavily with analytics and analysis. And I really hate it when people say, oh, my analyst did this thing, but data science, can you do better? I'm like, data science is [00:38:00] not better data analytics. They're separate disciplines. They have separate focus.
[00:38:03] Your analyst should be better at some things. Your scientist should be better at other things. Please try not to confuse the two. Data science is not it, right? Yeah. What advice have you received from someone that's stayed with you and that you'd maybe want to share with the audience? Learn how to take a punch. Not physically, probably a good life skill, but metaphorically. If you're a data science leader, you're going to have great ideas that you are 100 percent certain about that do not land.
[00:38:30] And if you take it personally and you let yourself get down about it too much, I can guarantee you it's going to happen again, and you're going to feel that bad again. And I'm not saying you shouldn't feel bad, but learn to be like, okay, I did the best I could on this one. The partner I'm working with, or this initiative, just didn't land, and it sucks, and I'm going to remember the things that I did wrong from last time so that I don't make them again.
[00:38:52] But you can't let it own your life. That's really good advice. What's one piece of advice that maybe you'd give to somebody who's [00:39:00] just graduating, starting their career in data science, and wants to be in your position someday? I would caution them against that. I feel like if I were an IC data scientist coming out of school right now, I'd be so much happier doing IC data science work.
[00:39:13] I think about the work I did when I was coming up; there are so many tools now that support so much automation for them. The job just got to be a lot more fun. But if they did want to get to my level, I would say: You will cap out in terms of your data science knowledge requirements. I have lots of data scientists who work for me who are better data scientists than I am now, or I ever was.
[00:39:33] And I didn’t get to where I am now by being the best data scientist. I got there by being a sufficiently crappy data scientist and by having enough knowledge of business practices and by being able to work with people and by being able to lead. And like a lot of people are like, Oh, I need to improve my career and make more money by being better at doing data science stuff.
[00:39:52] And I'm like, yeah, that will not get you my job. That'll get you principal scientist at a big company, or lead data scientist, or something. If [00:40:00] you want my job, you gotta cross train. And what's the best way to do that? Is it just building partnerships across the organization?
[00:40:07] How do you get that cross-training, or at least put yourself in those situations in which you can gain that experience? What advice might you give there? You've got to get reps. You've got to get exposure. You've got to do some stuff that is way outside your strike zone.
[00:40:19] Like, how many directors of data science have done a sales call and tried to sell an external customer on the product? I've done that. I didn't close the sale, because I'm a terrible salesperson, but I've sat there and I'm like, okay, I now have a very intimate personal experience of here's what it's like to cold call someone to try to sell something. You know, am I going to revolutionize sales based on one call I've done?
[00:40:41] No, but at least now I know a little bit of the bare bones of what's going on. And if you want to be a leader across lots of disciplines, you have to know some of the bare bones of those disciplines. You have to have sat down with a marketer and tried to adjust the creative for a campaign. You've got to have sat down with a salesperson and tried to do a cold-call sale.
[00:40:58] You’ve got to have sat down in a [00:41:00] CX role and tried to debug a customer issue live on the phone where you have no idea what’s going on. If you haven’t put in at least a little bit of skin in that game, you know, I’m sure there are folks who are really smart and really talented who can get by without it.
[00:41:14] But it's just such a good investment to make in your career knowledge. Yeah. And is that something that, in your experience, you had to raise your hand for and say, I want to do this? Or has it just naturally happened where you've been involved in sales calls or been involved with the marketers?
[00:41:28] It's been a mix. Sometimes you do have to raise your hand and say, I would like to do X. Sometimes you have a company whose culture is really based upon that. I've worked at startups where they're like, we want everybody, as part of their onboarding, to shadow, you know, a customer support person for an hour or two, so that they have empathy for that.
[00:41:44] I think companies that have that sort of culture tend to do pretty well across those, but it's not a requirement. Sometimes it's just not scalable. But yeah, you have to be open to it. And sometimes you do have to go out and be like, nope, I want to go on a sales trip. I want to go meet a customer on site.
[00:41:58] I want to sit in the room. I want to [00:42:00] hear the negotiations. I want to see us walk away dejected because we didn't get the sale. Tag me in. I want to do that kind of thing. Yeah. That's great advice. All right, last question, Elliott. What gets you really excited right now about the world of data science?
[00:42:11] Maybe, you know, what's coming, or the potential. What just gets you really excited right now? Yeah. So, the gen AI hype of last year, that's carried over. To me, and a lot of data scientists, it looked like an incremental advancement on a problem that people have been solving for a while.
[00:42:27] And what I thought was really cool was it came with a product solution. And a lot of old school ML folks were like, we’ve built great ML for decades. It’s comparable to this, but you had to build a product solution on top of it. And a lot of people just weren’t willing to build a product solution on top of it.
[00:42:42] And I think now a lot of folks across businesses are saying, yeah, OpenAI, Anthropic, etc., they built a snazzy chatbot product solution on top of it. A bunch of startups are building these snazzy product solutions. But maybe it's not right for us. And a lot of people are waking up to, hey, maybe we need to build our own [00:43:00] product solution too.
[00:43:00] And so I feel like the next maybe 12 months are going to be the heyday for old-school ML folks being like, hey, yeah, remember we can do that thing. Build a product layer and we can solve that problem. And I think it might be a sort of second era of really good ML products. Fantastic. Listen, Elliott, I really enjoyed this conversation.
[00:43:19] We typically, as I said, talk to go-to-market leaders, but your perspective on the data science side was something I learned quite a bit from, and I think it's going to be really great for our audience as well. So thank you very much for your time. I hope maybe next time I get out to the Bay Area, we can grab a coffee or something.
[00:43:32] You name it. Thanks so much for having me, and best of luck on things. All right. Thanks, Elliott. Have a good day. You too.