Episode 210:

From Reporting Basics to AI Automation with Eric Dodds and John Wessel: Navigating the Complexities of Data Standardization, Observability, and Business Alignment

October 9, 2024

This week on The Data Stack Show, Eric and John talk about the complexities of reporting and analytics in the data industry. They discuss the evolution of job titles, the critical need for standardization, and the challenges of effective reporting. The conversation also highlights the importance of observability, consistency and effective communication between data and business teams. Eric and John also touch on the impact of AI and automation on data quality and reporting processes and emphasize the need for a collaborative approach and robust monitoring systems. This episode provides valuable insights for enhancing data-driven decision-making in various business contexts and so much more.

Notes:

Highlights from this week’s conversation include:

  • Reporting and Analytics Discussion (1:09)
  • Automation in Reporting (3:16)
  • AI’s Impact on Analytics (5:00)
  • Data Quality Challenges (6:56)
  • Reinventing Reporting (9:23)
  • Automated Reporting Services (14:35)
  • Growth Trajectory of Reporting Tools (16:01)
  • Market Size Comparison (18:04)
  • Static vs. Time Series Data (21:27)
  • Differentiating Reporting and Analytics (26:26)
  • Ad Hoc Analysis vs. Reporting (29:52)
  • The Role of Data Scientists (34:03)
  • Planning the Reset (38:36)
  • Focus on Business Problems (41:30)
  • Identifying Business Needs (44:01)
  • Heuristics and Intuition (48:03)
  • Importance of Monitoring (52:20)
  • Observability in Analytics (55:53)
  • Observability in Metrics (58:08)
  • Consistency in Definitions (1:02:39)
  • Impact of Changing Definitions and Final Takeaways (1:04:15)

 

The Data Stack Show is a weekly podcast powered by RudderStack, the CDP for developers. Each week we’ll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data.

RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack visit rudderstack.com.

Transcription:

Eric Dodds 00:06
Welcome to The Data Stack Show.

John Wessel 00:07
The Data Stack Show is a podcast where we talk about the technical, business, and human challenges involved in data work.

Eric Dodds 00:13
Join our casual conversations with innovators and data professionals to learn about new data technologies and how data teams are run at top companies. Welcome back to The Data Stack Show. We have a little bit of a special episode for you today. We’re recording straight to the computer without internet because we had severe weather in the southeast, and it’s caused a pretty big mess for a lot of people here, including us in Greenville. Our hearts go out to everyone who’s been affected by this. There are lots of people without power, and there are lots of people, especially in western North Carolina, who are in danger. So definitely send your care and help out in any way that you can. So today you get John and me alone, and I got to pick the subject, which is reporting and analytics. I love talking about this, and I can’t wait to hear all your thoughts, because you’ve done this for a long time. But when I say reporting and analytics, John, what topic do you want to hit most?

John Wessel 01:20
Yeah, I’m excited about this topic, but first, our thoughts and prayers are with those affected, especially in the western North Carolina area, which is just 45 minutes from us. So if you’re listening in that area, our thoughts and prayers are with you, for sure. So reporting and analytics, man, that topic is a little bit out of vogue, I would say. Like, if you look back 10 years ago, reporting analyst was a fairly popular job title, data analyst too. Now, like you’re saying, it’s data engineer, or data scientist, or analytics engineer, of course, with dbt’s Coalesce around the corner here. So I think reporting, in some ways, has taken a back seat from a high-level view. But practically, almost every business has some kind of reporting going on, sure, and as far as the topic of reporting and analytics, that whole space, like how you run your business, in my mind, that is typically off of reports or some kind of reporting-adjacent thing, yep. And then you get into what I might call studies, more like engineering studies or data science type work, and that might be how you make your business better. A lot of times I think of that as more of an ad hoc type thing

Eric Dodds 02:38
We need to do it. We should get into definitions for all of these. Yes, okay, yeah,

John Wessel 02:42
we’ll definitely get into that.

Eric Dodds 02:44
I’m gonna ask you to define all this. Yeah. Okay, sorry, I interrupted you. What else?

John Wessel 02:48
Yeah, yeah. And then I think just understanding how businesses are thinking internally about these topics, where we have different definitions of roles at a lot of companies. Like, we’re talking analytics engineer here, but more so in reporting and analytics, you’ve got these roles like data scientist, and then you’ve got these practical jobs like reporting, and there’s just this mix-up of things, you know, just, like, a soup of things, yeah,

Eric Dodds 03:15
all right, definitely getting into those. Okay, the things that I’m interested to pick your brain on: one, how possible is it to automate some of this stuff across business models? I think that’s a really interesting topic. There are companies that have tried to do it, so I think that would be fun. And then, if we have time, I want to ask you about the culture around reporting and analytics, because so much of this is not technical, you know, the questions are not technical. It’s actually like, you know, who is using the reports? How are they using them? How are decisions made? Right?

John Wessel 03:48
Communication between the teams, the data team, the business teams, yeah,

Eric Dodds 03:52
Okay, we’ll try to get to all of it. Let’s dig in.

John Wessel 03:55
All right. Sounds good,

Eric Dodds 03:56
John, the experience I had last week that brought this reporting and analytics discussion to light was we were talking with one of our very early stage investors, and he was looking at a bunch of different analytics from a bunch of portfolio companies, reports, actually, which we should talk about, reports versus analytics. These were reports, okay? And it’s startup companies, you know, and so things can tend to be messy, and that’s fine, but there wasn’t really a lot of consistency across the companies in terms of KPIs that probably should have been, you know, pretty standard across these different companies. And he just made a comment, which is funny, but makes a great point, that, you know, everyone’s going crazy over AI, and we still can’t do basic analytics. Which is true. You know, just very few companies even do basic analytics really well. And I think the ones that do advanced analytics are probably pretty rare. Why is that, do you think?

John Wessel 05:00
Yeah, I think that’s a good question. With the AI thing, part of the craziness behind AI is that, like, oh, well, you can do this, all that, you know, because reporting and analytics is hard, right? The promise of AI is the promise of easy, right? It’s like, oh, you do that, but I can make it easier, or, you know, maybe better. So I think that’s how AI ties into it. But as far as reporting and analytics, you’re mentioning portfolio companies here, so I’m assuming they’re all similar companies. Maybe they’re all SaaS companies of similar size. Like, a lot of commonality, right?

Eric Dodds 05:34
So, I mean, different stages, but at least a baseline threshold to where there should be, you know, some level of, like, consistent B2B SaaS KPI reporting,

John Wessel 05:46
yeah. So I do think it’s industry dependent, and I agree that the majority of industries do not have, like, great reporting, but I think there are some, like some financial industries, where there’s, I think, a really high bar, and they’re very organized and have standards, yep. And I would say probably tending toward financial or more heavily regulated industries, which I don’t have a lot of personal experience with. Like, there are standards. If you’re trading stocks, there’s a lot of data, and there’s a lot of organization around that data, because it’s really important, right? Yep. Whereas, if you are maybe a software startup, or you are a manufacturer, there are fewer standards around, like, your customer data or your, you know, order data, as far as how you should model it, how you should think about it. And that’s just, you know, I think just the evolution of certain industries. It doesn’t need to be standardized, therefore it’s not, yep,

Eric Dodds 06:40
yep, yeah, it’s interesting. I think go-to-market models vary so much as well, even among B2B SaaS companies, right? How much of a problem is the reporting itself versus the underlying data, in your experience?

John Wessel 06:56
Yeah, that’s actually, I think, pretty industry dependent as well. For example, SaaS and e-commerce, I do think, have some advantages as far as a higher level of data cleanliness, because a lot of your data is clickstream-like data that, in a sense, a computer generated, right? So it’s better. Whereas, if you’re in an industry like manufacturing, and all of your sales orders were entered into a system by a salesperson, you may have input data problems that are just wrong. It’s just practical, like, I need to clean this data to get it right and do a bunch of feedback back into the system to fix things that are wrong. Whereas in e-commerce and SaaS, at least if it originated from either clickstream or something that got validated, like, maybe you have some kind of validated address or something with a Shopify checkout or Stripe checkout, at least you can start a little bit cleaner versus, you know, some of the ones where it’s a really bad starting point, yeah, a lot of places, yeah,

Eric Dodds 07:59
That’s super interesting. Okay, I feel like this is supposed to be a conversation and I’m just asking you all these questions. One, so we talked about, and also part of the reason behind these questions is because you, as a consultant, work on a lot of basic analytics stuff for your clients, and there are lots of consultants out there whose entire job is helping companies do basic analytics. It’s a huge industry, yeah, definitely even attached to specific tools, right? The ecosystem around support for just getting reporting up in Tableau. I mean, I don’t know how big that industry is, but I would guess that it’s

John Wessel 08:43
huge. Yeah. I mean, think Salesforce, right, speaking of Tableau? So there are, obviously, lots of agencies that implement Salesforce, but I think there are a number of them too where it’s Salesforce reporting, right? Like, they’re actually building reports, yes. Or pick your other ERP or CRM, and a lot of what they’re doing is literally just building reports in that specific software,

Eric Dodds 09:11
yep. How much of that work, this is the question I was gonna ask you, how much of that work do you think is reinventing, or doing essentially the same thing but just slightly tweaked?

John Wessel 09:23
I think a lot of it. So years ago, I actually worked in a very interesting environment in terms of reporting. We had a pretty early interface where customers could create their own reports. This was 10-plus years ago. Customers could create their own reports in our system, and I, at the time, was on the database side, the database administration side, and the things people came up with were wild. The amount of complexity, because we gave them a very configurable GUI but basically didn’t put limits on the GUI, where they could have infinitely long WHERE clauses, select tons of columns, and, you know, basically join different entities together. And they came up with some wildly complex things, which then I’m seeing on the back end, like, who wrote this query? It’s like, well, somebody made it in the user interface, right? It was just, you know, destroying our underlying compute and memory resources. So I know some software still lets you do some of that, but there are a lot more guardrails on that now that people have learned, like, we can’t just give people unlimited abilities, we have to put some guardrails and restrictions around this with self-service stuff. And then, of course, Tableau and things like that introduce caching layers, so it’s like, all right, there’s this caching layer in between the database, or you move your database to Snowflake or Databricks, and you have a layer in between. But we were actually having, basically, a reporting workload that anybody could make, one that would query production databases. And it was wild and bad, yeah,
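
A minimal sketch of the kind of guardrails John describes, not from the episode: wrap whatever SQL a self-service GUI produces with a row cap, and pair it with a session timeout so one runaway report can’t take down production. The helper name and the Postgres-style statement_timeout parameter are assumptions for illustration.

```python
# Minimal sketch of self-service query guardrails (illustrative only).
# Assumes a Postgres-style warehouse that supports SET statement_timeout.

def add_guardrails(user_sql: str, max_rows: int = 10_000, timeout_ms: int = 30_000):
    """Wrap a user-built query with a row cap and return a session timeout statement."""
    # Strip a trailing semicolon so the query can be nested as a subquery.
    inner = user_sql.strip().rstrip(";")

    # Cap the result size regardless of what the GUI generated.
    capped_sql = f"SELECT * FROM ({inner}) AS user_query LIMIT {max_rows}"

    # Session-level timeout so one runaway report can't hog the database.
    timeout_sql = f"SET statement_timeout = {timeout_ms}"

    return timeout_sql, capped_sql


# Example: a GUI-generated query with a big join and no LIMIT.
gui_query = "SELECT * FROM orders o JOIN customers c ON c.id = o.customer_id WHERE o.status = 'open';"
timeout_stmt, safe_query = add_guardrails(gui_query)
print(timeout_stmt)
print(safe_query)
```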

Eric Dodds 10:52
Yeah. Part of the question is thinking about the way that a lot of consultants, and even I in the past, approached helping with analytics, and a lot of it was standardization, right, where it can be really helpful. You go into a company, or you go to work at a business, and you say, okay, there are formulas for the way to measure, you know, e-commerce or B2B SaaS customer lifetime value, exactly. And so a lot of times it’s just implementing a framework that has guardrails, like limited scope, discipline, et cetera, you know. And so I’ve always thought it’s interesting, right? Could you essentially report on every e-commerce company in the same way? I understand that obviously there are so many implications and even wildly different business models within e-commerce. But is the reporting different from e-commerce company to e-commerce company? Right? It’s

John Wessel 11:55
not in general, but it is specifically. Like, there are tools out there, I think basicity is one, that try to tackle this, and there are others, for e-commerce specifically, yep. But e-commerce, there are so many different flavors of e-commerce. There’s e-commerce like, I own a coffee shop, and we sell merch and coffee online, but we have multiple physical locations, and we have wholesale channels, and we have partnerships with these consolidator programs. So there’s the long tail, yeah. And I think if you’re like, hey, I’m truly a Shopify store, and we sell some things on Amazon too, and that’s all we do, yep, there’s stuff out there for that, for sure. But as you add in multiple channels, and especially specialty channels, there are no click-button integrations into, like, industry-specific stuff at this point, yep, for sure. And then back to the standards thing, I don’t know if people just like to make their own life harder, I don’t even think that’s probably best, but if you give somebody a framework, like, here’s customer lifetime value, people are like, yeah, but we’re gonna tweak that for this reason. And a lot of the reasons aren’t super valid, but right, people do it anyway.

Eric Dodds 13:12
Yeah, I think that’s probably more what I was thinking of, not the tweaks to things like customer lifetime value that aren’t really necessary, but the long tail of differences in the underlying components of the business, even if, on the surface, let’s say the funnel looks the same. But even where the data comes from, the different relationships you have with other companies, suppliers, different types of customers, et cetera, are really hard, and that kind of makes me wonder if the idea of automated analytics is possible. There is a company, I think the company is still around, I think they’re called June, and I think you basically hook up RudderStack, or Segment, or some other source, I’m sure there are other sources as well, right? But essentially some sort of standardized event schema, and they will just generate all of these reports for you automatically, right? So, yeah, a retention report, all these other things as well, from your own custom events and all that stuff that you’ve instrumented. Do you think it’s possible for a service like that to be successful, given all the stuff that you just mentioned, those meaningful differences?

John Wessel 14:34
I think for a while. I think most of the things like that, like those companies, have the right season for them. I even did a tweaked version of this myself previously. So when we first started doing analytics at a previous company, we started with kind of an all-in-one: hook up your Google and Shopify and various sources and ingest everything in. I think they maybe even attempted to model some of the data for you, and then you bring your own visualization tool, yep. So that’s a version of that. And then there’s another version that is even more, like, full GUI. There’s one called Glue, I think is what it was called, where you log in, hook up all your stuff, and literally come back a couple days later and all this stuff is populated and ingested for you. So, yeah, all that’s possible, and it ends up being industry specific when you get the value out of it, I think. A generic one, I don’t see that working, just generic, like, hey, this will work for all use cases. But if you can really get granular, and you get somebody that really gets a certain domain and what’s valuable, and exactly how to calculate, you know, the common metrics, and then really how to present them in a compelling way to get people to actually act on them,

Eric Dodds 15:56
you bring up, like, the elephant in the room when it comes to putting in analytics, yeah.

John Wessel 16:01
Yeah, I think there’s value in that. But assuming you’re on a growth trajectory, I think eventually you, I don’t want to say you grow out of it, but you grow to, like, oh, we want to tweak that, and we want to tweak that. And then, from a product standpoint, the right answer for the company would be to say no, right? Like, for them to have a good product, they need to say no to all the customization, right? So to me, I guess an ideal thing would almost be to have the standards, have the models, like models that could be written back into Snowflake or Databricks or wherever you store your data, and then also allow you to leverage those models in other places as you continue to grow. Yep, that might be a model that works, which

Eric Dodds 16:49
is called dbt. Yeah, you have all the models, you know, based on models in dbt. Well,

John Wessel 16:56
a lot of the providers are kind of going this way, where the ETL providers say, hey, we’ll extract the data for you, oh, and by the way, we have all these baseline dbt models for you. So then, okay, cool, everything’s modeled, yep. And then when you get there, you kind of have a level of opinionation around, like, hey, this HubSpot or Salesforce data is modeled. And a lot of times the ETL provider will create the model in dbt for you, yeah. So then you have an idea of, oh, this is how the model should work. Oh, look, this is how they calculated customer lifetime value or whatever, because the calculations are in dbt. So I think, practically, that’s probably the most practical way that this continues to evolve: you have community-managed dbt standards for these common tools, yep. And you may end up with different variations, like HubSpot or Salesforce as an e-commerce company, HubSpot or Salesforce as a SaaS company, yep, that seems practical, yeah. And then you take that and you kind of fork it, and maybe you make some modifications from it, but you try to keep that base intact as much as you can. Well,
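
To make the shared-definition point concrete, here is a minimal sketch, not from the episode, of one common naive customer lifetime value calculation (average order value times average orders per customer) in pandas; the column names and sample data are hypothetical. Community-managed dbt packages typically pin down a calculation like this so teams fork from a shared base instead of re-deriving it.

```python
# Minimal sketch of a naive historical customer lifetime value calculation.
# Column names (customer_id, order_total) and data are hypothetical.
import pandas as pd

orders = pd.DataFrame(
    {
        "customer_id": ["c1", "c1", "c2", "c3", "c3", "c3"],
        "order_total": [120.0, 80.0, 45.0, 200.0, 150.0, 95.0],
    }
)

avg_order_value = orders["order_total"].mean()
orders_per_customer = len(orders) / orders["customer_id"].nunique()

# Naive CLV: average order value x average number of orders per customer.
clv = avg_order_value * orders_per_customer
print(f"AOV: {avg_order_value:.2f}, orders/customer: {orders_per_customer:.2f}, CLV: {clv:.2f}")
```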

Eric Dodds 18:04
What’s interesting to think about, and a lot of my questions around standardization were very leading questions, because you probably already knew, is that it’s just an interesting concept, right? Like, can you actually automate that stuff? It’d be interesting to pull the numbers on this, but even just thinking about the size of the market for BI tools versus the packaged analytics tools, right? So even in the product analytics space, there are a number of product analytics tools, right? I mean, Amplitude went public, sure, but still, even if you compare the size of the SaaS-based tools where you point some SDK at the tool and it can generate a lot of reports for you, the size of that industry is so much smaller, even on face value, than, you know, BI and sort of customized reporting, right? It

John Wessel 19:04
is weird, right? Because I get it, in a sense. The reason BI is so much bigger is because there’s more to show. And, practically, more people are going to interact with the BI tools. An executive may log in and look at a dashboard or download some data; somebody from marketing, somebody from sales, somebody from ops are all interacting with BI, yeah. When you get more niche and, you know, just one team is interacting with it, then the market’s smaller, yeah,

Eric Dodds 19:32
totally, yeah. And, I mean, a lot of the product analytics stuff is all event based, right? But think about a lot of core reporting: it’s combining aggregates, right, that include both, you know, structured and time series data, right?

John Wessel 19:48
And speaking of time series, there are a ton of teams that don’t really deal with time series data, yeah. If you were to do a chart of how many people just use static, non-time-series, rows-and-columns table data versus time series, I have no idea what the numbers would be, but I wouldn’t be surprised if it’s like 80/20, or even bigger than that, as far as only like 20% even dealing with time series data on any significant scale, right?

Eric Dodds 20:18
Yeah, it is interesting to think about that, how many reports leverage a lot of time series data. I’d have the same estimate as you, where it’s probably not as much as you think. I mean, in the world of SaaS, we just tend to think that way, because we’re tracking user behavior, or, you know, in e-commerce you’re tracking clickstream, et cetera. But even then, the other interesting part of that is, let’s say a company has some core reporting that doesn’t include time series data. A lot of times those companies are looking at time series data, but it’s going to be in Google Analytics, for example, right, where they look at that completely separately from their core reporting, and they don’t actually include that in the sort of core reporting that would happen in some sort of BI tool or go into some dashboard, right? I think a lot of times that’s because going from static, structured data and building reports off of that to dealing with time series data, and actually getting that data into a data store and running aggregations over it, is a non-trivial step, right? Non-trivial step. It

John Wessel 21:27
is. And what I’m thinking of with reporting, I mean, accounting, right? It’s a big component of this, and especially in accounting, it’s snapshots. You do not like to change accounting data after it has been finalized. Definitely sometimes you have to, and then it’s generally not good.

Eric Dodds 21:45
I was at the Y the other morning, speaking of that, and this older gentleman had, I believe, I almost went up and asked him, but he had an Enron shirt. Oh, no way. And it was, you know, aged enough that he probably, oh, totally, I was like, this guy worked at Enron, you know, and he’s wearing his old Enron shirt to go work out at the Y. Anyway, that’s what happens. I was like, I bet that shirt’s worth some money. I bet it is. But yes, you can change accounting after the fact. It doesn’t end well,

John Wessel 22:17
right? Yeah. And I think most people in data analytics that don’t have any event stream experience, they think in static data, and it actually drastically simplifies a lot, where it’s like, hey, we have clickstream, we have streaming data, let’s snapshot it, I don’t know, daily, monthly, weekly. It’s just easier, right? And sometimes that is the right answer. Sometimes you don’t actually need the clickstream data, you don’t need the history, you’re just like, ah, let’s just snapshot it weekly or whatever. And same thing with history, right? There’s a lot of static data that changes a little over time, that would be great to have the history for, but people are used to just not having it. Yes, yeah. That’s the other component, where it’s not clickstream, but it’s that same series over time, just in a different way, because you’re looking at something in a system, and if I were to pull up the screen on Monday versus Tuesday, the data in the system is different, and the system won’t tell you what it used to be. People are just used to, oh, it used to be that, I don’t know, and you just shrug your shoulders, yep. But a lot of the analytics tooling now has this: Snowflake calls it Time Travel, Databricks has it, and some of the new table standards have the same type of thing, where it’s like, oh, I can just look back and see what it was on Tuesday, yep, and query it. So that’s the other time-series-esque thing, but it’s with the static data, which is kind of cool.
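
For readers who haven’t used it, here is a minimal sketch, not from the episode, of what “query the table as it looked on Tuesday” means with Snowflake’s Time Travel syntax; the table name and timestamps are hypothetical, and the snippet only builds the SQL strings rather than connecting to a warehouse. Databricks’ Delta Lake offers an analogous TIMESTAMP AS OF / VERSION AS OF clause.

```python
# Minimal sketch of Snowflake Time Travel queries (table name and timestamps are hypothetical).
# Time Travel lets you query what a table looked like at an earlier point in time.

table = "analytics.core.customers"

# Query the table as of a specific timestamp (Snowflake AT ... TIMESTAMP syntax).
as_of_timestamp = (
    f"SELECT * FROM {table} "
    "AT (TIMESTAMP => '2024-10-08 00:00:00'::TIMESTAMP_TZ);"
)

# Or query the table as it looked 24 hours ago (OFFSET is in seconds, negative = past).
as_of_offset = f"SELECT * FROM {table} AT (OFFSET => -60*60*24);"

print(as_of_timestamp)
print(as_of_offset)
```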

Eric Dodds 23:39
I think the other thing with static data and time series data is that a lot of times, for core things that you want to report on, even if it’s structured data, you have some sort of timestamp in the structured data that represents some event that’s important. So if you think about a user signing up, or a purchase being made, or something like that, you may not be emitting or observing or logging an actual event payload for that, but there’s a timestamp on the order, there’s a timestamp on the user account creation, like, when was that row added to the database? Right? So you can pull that in, and so you have these proxies for events with the timestamps that are on these different objects, right? And a lot of times, to report on how your business is doing, that’s all you need. You don’t have to have a specific event payload. Now, when you do capture actual time series data, especially behavioral data, you can do really cool things around trending, and, I mean, it’s an input to all sorts of things, right? Machine learning models, I mean, you can do some really cool stuff. But for core reporting, a lot of times you have the proxy with the timestamp,
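
As a small illustration of the proxy idea, here is a sketch, not from the episode, that turns a plain created_at column on a users table into a daily signups series with pandas; the column names and sample data are hypothetical.

```python
# Minimal sketch: using a row timestamp as a proxy event to build a daily signups report.
# Column names and sample data are hypothetical.
import pandas as pd

users = pd.DataFrame(
    {
        "user_id": [1, 2, 3, 4, 5],
        "created_at": pd.to_datetime(
            ["2024-10-01 09:12", "2024-10-01 14:30", "2024-10-02 08:05",
             "2024-10-03 16:45", "2024-10-03 17:02"]
        ),
    }
)

# No signup event was ever emitted, but the created_at timestamp stands in for one.
daily_signups = (
    users.set_index("created_at")
    .resample("D")["user_id"]
    .count()
    .rename("signups")
)
print(daily_signups)
```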

John Wessel 24:55
Yeah, right. And a lot of systems were designed this way, with what I would call milestones, or you could call it workflow, but milestones of, like, all right, we’re talking logistics, so picked up, shipped, delivered, billed, invoiced. The system is just built that way, where it’s just static, literally a column that says shipped_at and the date. And for a lot of businesses, depending on your business model, compressing the time from, let’s say, order creation to invoicing is huge for cash flow. So a lot of analytics in supply chain, manufacturing, e-commerce, those industries, it’s like, okay, cool, how can we compress this time? Yes, yeah. So that’s a super valuable thing for them, especially if their system has certain built-in milestones. And then if you want to get more granular, that’s where clickstream data could be really valuable, where we can actually have more goalposts. When we drill in, we can be like, oh, there’s this big gap between order created and shipped. What went on between here and there? There could be a million reasons why it didn’t ship, but if you don’t have those things tagged or organized in any way, it’s impossible to root-cause what happened, why didn’t it ship on time? Yep.
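
Here is a minimal sketch, not from the episode, of the milestone-timestamp analysis John describes: computing cycle times between hypothetical order_created_at, shipped_at, and invoiced_at columns to see where time could be compressed.

```python
# Minimal sketch: cycle times from static milestone timestamps (column names are hypothetical).
import pandas as pd

orders = pd.DataFrame(
    {
        "order_id": ["A1", "A2", "A3"],
        "order_created_at": pd.to_datetime(["2024-09-01", "2024-09-02", "2024-09-03"]),
        "shipped_at": pd.to_datetime(["2024-09-03", "2024-09-07", "2024-09-04"]),
        "invoiced_at": pd.to_datetime(["2024-09-05", "2024-09-12", "2024-09-06"]),
    }
)

# Days between each milestone; these are the gaps you'd try to compress for cash flow.
orders["days_to_ship"] = (orders["shipped_at"] - orders["order_created_at"]).dt.days
orders["days_to_invoice"] = (orders["invoiced_at"] - orders["order_created_at"]).dt.days

print(orders[["order_id", "days_to_ship", "days_to_invoice"]])
print("Average days from order to invoice:", orders["days_to_invoice"].mean())
```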

Eric Dodds 26:17
Okay, let’s get into definitions. We probably should have started here, but you mentioned this a little bit when we were chatting about this topic: reporting is just a term that’s used, I think, for a lot of different things, right? So, you know, how robust is our reporting, right? And that could mean a number of different things. So you have reporting, analytics, dashboards. Break down the terms for me. Like, how have you used those in the past? Is there a meaningful difference between reporting and analytics?

John Wessel 26:48
Yeah, and I don’t know, I’m sure there are standards, so I will just speak to what I’ve heard practically as far as the differences. There are probably people who feel strongly, and there’s probably a Medium post about the difference, which is probably helpful, with references to books written solely on reporting or analytics. So aside from that, this is just the practical definition here. So let’s talk reports, analytics, throw in dashboards, and I’m gonna throw in stuff that I would call a study or a project, okay, you know, a journey. So on the reporting side, what I think of as a report is something that is generated statically. You could have event stream data in a report, but I don’t typically think of it that way. It’s something you’re typically running with a date-time range, right, and then some other inputs, like, I want to run it for this date, and I want to run it for this person, this client, this group of people, this department, whatever. And then, typically, you’re getting some kind of data set and maybe some charts and graphs as part of, like, a dashboard. Yep, that’s what I’m typically thinking of. And I would think of this as overlapping circles between reporting and analytics, in that, sure, you could classify a lot of that as analytics too, I think that would be fair, yep. For the analytics side, I’m gonna focus on what I’m gonna call more of a project or a study or something. And I think this is super valuable, and there are these two camps: if you grew up as a data analyst working for a department, like marketing or ops or something, versus you grew up in IT and have done more centralized business intelligence, let’s say. I think there are two different perspectives here. If you grew up as a data analyst, let’s say for ops or something, that person typically does more of what I’m gonna call these analytic studies, where they have a very specific objective. Like, this is one that came up recently: we have this business process, and it’s taking forever. We need to analyze why is this thing taking so long, we need to understand what’s going on here. So then the analyst goes in, they pull some really messy data, get it into Excel, and they see, oh, these people are tracking what they did every day, and there’s a tab per day, and it goes back two years. So okay,

Eric Dodds 29:39
All right. So, yeah, we’ve both opened those spreadsheets, and I think our listeners have too, and there’s that feeling in your gut, you know, when you see that. It’s just a very specific feeling, yeah, it is a very specific feeling. So

John Wessel 29:55
then it’s like, okay, we’re gonna find some way, or we’re gonna write a script, and do something to get it all tabulated on one page, and then do some ad hoc analysis. So we see, and this is a real story from recently, we see that Wednesday was spelled correctly in some places and spelled incorrectly in others. So then you try to combine it all together, and you’ve got two Wednesday columns and then a bunch of other stuff, because anytime you have, like, 300 sheets in Excel, something got tweaked along the way. And, for example, Wednesday may be spelled wrong in the whole rest of them, because one of those got copied to the next day and something happened, yep, yep. Super common. So this is down the road on this kind of project where, if you have that IT background, you start, like, oh, this has to be repeatable. I gotta write this script so it can handle every use case. If they spell Wednesday five different ways wrong, I’m gonna handle those. And if you’re more of a data analyst, you’re just like, I just need to get to the end result. I’m not thinking about repeatable, I’m just thinking, I gotta get to that answer fast, right? Like, where’s the 80/20 on this project where we can make a difference? Yep. So I think IT people on this kind of thing get caught up in all the edge cases that just don’t make sense, and then the analysts, yeah,

Eric Dodds 31:09
you want to build something scalable, yeah, you want it to be good and scalable, pay the bill that’s been created, and create a scalable system. Yeah. So,

John Wessel 31:16
But this study, I think, these analytic studies, as I’m gonna call them, are a nice place where you kind of thread the needle: move quickly, do some of it programmatically, but then pause and ask, okay, can I drastically decrease my time to result by doing some manual steps? If the answer is yes, take that path, get to results, try to make a business decision, and then bring it back. If this is now becoming a regular thing, bring it back and push it across the next 20% to make it fully automated. Yep. And don’t be scared of that, right? Because if you’re from an IT background, you don’t ever like producing anything that isn’t fully automated, because if you do that enough times, then you’re gonna drown, like, you will never recover, sure,

Eric Dodds 32:04
Yeah, how much work do you have to do? Right? Exactly. Yeah, it’s preventing future pain,

John Wessel 32:08
right? Whereas an analyst, like, if you work for the business, it’s like, I don’t know, let’s just hire another analyst. That’s kind of a lot of businesses’ maturity, yeah? So, I don’t know, that’s my one thing where I wouldn’t call that reporting, right, that kind of quick study, ad hoc type analysis, yeah, yeah.

Eric Dodds 32:28
I think that’s great. I think it’s a good framework to summarize that. I really like the distinction of reporting being automated reports that are generated for some specific business function, right, within the organization. Not that those things don’t change or people don’t make requests, right? But it’s sort of the core data set that people are using on a day-to-day, week-to-week, month-to-month basis in order to make decisions, right? But companies generally collect, or have, a much wider set of data than what’s in those reports, right? Which would be sort of the universe of analytics, and then exploring the relationships between those and other things is a study. That’s a good term.

John Wessel 33:22
I might start using that, it sounds fancy, you know, versus, like, ad hoc analytics or ad hoc reporting, right? But I think study is nice, because here’s my little bit of a philosophical tangent. So you’ve got analytics, we’ve been talking about reporting and analytics, and you’ve got data science, right? So a data scientist, like, we’re real big on the data part, but the scientist part, I think we miss on the scientist part. What do scientists do? They study things, right? Sure. And they experiment over and over again. They experiment with a bunch of stuff, and they throw a lot of it away, so much of it away. Yeah, right. And I actually think that’s right for what a data scientist should be. But when you apply that in a business, that’s tough, and this is a different topic, but it’s a fun one, I think that’s a tough sell. Because think about what I just described on that ad hoc thing: with an analyst or a business intelligence person, you know, IT side, business side, we can come to a result, we can come to a practical situation like, okay, here’s the 80/20 on this, let’s work on improving this, good job, guys. You know, hey, can you automate this, take it the next 20%, we want to see it every week. Great, easy. The data science version of that is still a study. You may want to make a predictive model or something, yep, or do pricing, but you’ve added a lot more complexity. You need even higher coordination with the business, and alignment with the business. And then it really is data science, because a lot of the stuff is, at least for that particular business, not something they’ve done before or have a framework for. And then you do end up with a lot of throwaway work, and some of it is, okay, there’s bad coordination, bad communication, bad requirements, but I think some of it’s necessary, and I would like to see people being willing to accept, as part of the process, intentionally iterating quickly through things. Iterate all the way through, get to a point, and then scrap some of it, maybe scrap all of it occasionally, and keep going. Versus what I often see is we just keep going, and instead of scrapping anything, we just kind of pile on top, and we just make it worse until we get to this point where we didn’t want to scrap anything, which is even worse than it would have been, yeah,

Eric Dodds 35:43
Yeah, that’s really a good way to frame so much of the exploratory work that needs to be done. Because, again, you have your core reports, right, how much money did we make, how much inventory do we have, you know, all the core things, right? But so much beyond that is doing exploration to determine whether you should actually report on something, or to uncover some specific insight that you need in order to make a decision. But it’s hard to throw stuff away, though. I mean, have you seen that, where, you know, if you spend a lot of time doing something and, let’s say you get an answer, but you’ve spent a huge amount of time doing it, like, I don’t know, throwing stuff away can be hard. Is that something you’ve seen? It’s super

John Wessel 36:37
hard. I actually see this especially with new technology that people are implementing, and I started subconsciously doing this until I actively thought about it. So let’s say, you know, five years ago we set up Snowflake at my previous company, and previously we used Redshift, and we would have used the same methodology. Subconsciously, we would go through and set it up, and we weren’t super familiar with the tech, and we basically reset it up two or three times, not necessarily intentionally at first. But what I often see happen is people don’t want to have to redo anything. Same thing for, like, dbt would be a great example. If you’ve never used it before, and somebody sets it up, and you just keep evolving from that initial setup, and you don’t hit an early restart button, which is not that bad if you do it a couple months in, it just gets way more out of control and messy than if you’d said, okay, we’re three months in, I actually think we want to restructure a lot of this, and okay, we can rebuild this in a couple weeks. So that’s what I started doing when we’re implementing new tech: we’d have a start to it and then redo it two or sometimes three times. Wow. But you just planned that in, yeah, it was always part of the plan. So we’d start, and if you have three environments, it’s perfect. Like, all right, we’re gonna build out dev, and we’re just gonna keep building dev, we’ll use dev as production, okay, we’re in a spot. All right, now we’re gonna build out QA. So you can kind of do that. Typically for us, it wasn’t a big company, it was just dev and prod, but you can do your environments that way. The temptation is, oh, we’ve got to get the environments set up, let’s set up all three of them, we’re ready, let’s go. And you don’t have to do that. You can just set up the one, iterate on it, and then, okay, we’re ready for this one. Or maybe you just have two, like dev, and then three months later, okay, let’s do prod, let’s move stuff over, now we have our environments. So with things like that, to your point, it feels less bad. You’re like, no, of course we need two different environments, we didn’t waste anything. But practically, you are kind of starting over,

Eric Dodds 38:48
Yeah, yeah. That’s super helpful, because planning the reset is a really helpful concept. In so many cases, I would argue a majority of cases, you just keep going down the same path, and then you reach a point where the cost of resetting actually does outweigh the benefits. But then you have long-term pain that you’re just gonna live with, right? Because you can’t go back and pay that debt, you know, right? Okay,

John Wessel 39:23
I have,

Eric Dodds 39:24
I have a question that is not related to the technical aspects of this. Perfect. But you mentioned earlier getting people to care, you know. I mean, this is the classic lament about dashboard graveyards, right? I mean, yeah, all this work is done by analysts to set up these dashboards and this beautiful reporting, and so much of it goes unused, right? How do you begin to think about changing the culture of a company, right? And I won’t say data driven, we’ve talked on the show in the past about whether you can be data driven, the Cynical Data Guy commented on that, but it is a good point. And I’ll approach it from a very specific standpoint: when you have discussions within a company, or you need to make decisions, if there is data that can help that process, that’s really powerful, right? Because so many times things move slowly because there are disagreements about what decision you should make, right? And if those decisions are surrounded by a huge amount of subjectivity, having some definitive data that everyone can look at and say, oh, okay, this changes the way that we think about this, right, or this changes what I believe to be the true state, can be incredibly helpful, right? It’s not a solution for every situation, right? But we’ve both been in meetings where it’s like, oh, you know what, now that we’re really looking at this report, or looking at this data set, maybe we do need to reprioritize this project, or whatever it is, right? How do you get to that point? Because you’ve been a leader, you’ve built this stuff, you’ve worked as the analyst, you’ve had multiple leadership roles. What do you think about that?

John Wessel 41:28
Yeah, I think one thing, especially if you’re a data practitioner, is to just realize it is not about the data. Hmm, it’s not about the data. The data analytics needs to be such a background thing, in a sense, that if people are talking about the data analytics, you’re doing it wrong. People need to be talking about, hey, we need to improve cash flow, and great, data, getting the right data, getting everything in the right place, can really help with that, because we can get visibility into where we’ve got cash flow problems, and we can improve it. Great. So that conversation should be about cash flow, not about reports and data analytics broadly, yeah. And I think if you can get people really focused on, hey, if we solve this business problem, or we do a cost analysis and we realize where we can improve on costs, or we do some kind of revenue analysis, or, you know, we’re doing A/B testing on our website, all those business metrics, that’s the focus, and then the data is there to support it. And I would say the biggest cultural thing is if you can have leaders in those places, and by places I mean sales, marketing, wherever, that can be really good at identifying what the problem is, and then they can help quantify it, like, hey, we want to, you know, decrease cost or improve revenue or improve cash flow, yep, here’s what I think some drivers for that could be, and then the data team can go measure it. That’s great. If you’re really excellent on the data team, and you have some domain knowledge, say maybe you’re a marketer on the data team and you really know that marketing domain, then you can be part of that driver conversation, right, of, what do I think the drivers are for this? That’s great, too. But you have to have the right business conversation. If you don’t have people in the functional business roles defining what the problem is or what we want to improve, it’s really tough as a data team to make a difference.

Eric Dodds 43:40
Yeah, that’s really interesting. Have you in the past had to go, we’ve talked about this on the show before, like, actually going as someone on the data team into the business and figuring those things out, as opposed to waiting for the business to tell you what those things are? The majority

John Wessel 44:01
of the time, like, I have been in situations where I work with phenomenal business leaders that have great, you know, marketing or sales or whatever, and they knew what they wanted. They knew exactly how they wanted the team to perform, how they wanted to measure their business. And, man, those are awesome relationships. When you don’t have that, and you have enough experience, you start overlaying, like, oh, okay, I worked with this phenomenal sales leader, I know what good looks like in sales. And then you can really kind of help coach maybe a newer sales leader, or somebody in sales, on sales analytics, and the same for some of these other functional things. And that’s where, practically, rockstar data people are able to have kind of a playbook: you’ve worked with enough really good functional leaders to roughly know what good looks like, and then you can help introduce, like, oh, what if you measured it this way or that way? Because people don’t typically say no if you produce something good, you know what I mean, right? It’s very rare, if a leader isn’t quite sure what they want, maybe they’re new, or maybe they just haven’t really done things data driven, that you give it to them, like, hey, here it is, and it’s good, and they say, oh, that’s not what I wanted. I just haven’t run across that a lot, yeah,

Eric Dodds 45:21
yeah. That’s interesting. And, yeah. And sad. Actually, it kind of makes you concerned about, you know, the business leaders is it is that just because a lot of they’re sort of tracking their own stuff in their own way, and is sort of siloed from the rest of the business, I

John Wessel 45:40
do think they’re tracking their own stuff in their own way. I wouldn’t say it’s often siloed. I’d say it’s often that they’re tracking their own stuff non-programmatically, through an amalgamation of their impressions with customers, or they talk to their team and they feel like things are going well. Everybody’s making decisions on something, right? Yeah. But if they are doing it in a spreadsheet or something, they can usually communicate pretty clearly what they want. If it’s just, hey, I talked to the team, and, I like to pick on sales, sales are great, but it sounds like I talked to these people on the team, and he’s got a deal that’s about to close, and that’ll be fine, and we’ve got a couple deals here, I think we’re gonna make it this month. You can run a sales team that way, yeah, and that can work for a while, yep. But long term, say you brought in a new salesperson, and maybe they’re not the right fit for the role, and you’re just kind of managing based on, well, you usually have pretty good salespeople, then it becomes a problem. And then you end up with a couple of those, or you end up with some kind of weird alignment issues of industry and market, like, all of a sudden, oh, we’re kind of in a down market now. Those changes are what really throw people off, right?

Eric Dodds 47:02
Yeah. I mean, one way to describe that is tribal knowledge, right? And I think that is an interesting way to describe what you’re talking about, because it’s not like the people who don’t have a clear ask for the data team aren’t good at their job, necessarily, right? No, not necessarily. Because if they’re delivering, great, well, how do they do that, right? There may just be a lot of tribal knowledge that they have about their function, about the team, you know, they have a long history of facing these types of situations, and so their intuition may just be really good, right? Like, oh yeah, I’ve just looked at these numbers, and it’s like, okay, they log into six different tools, they look at the numbers, and they just have an extremely good sense, because they’ve been through so many cycles and seen the numbers change, and they get the behavior right. And so nine times out of 10, they’re probably like, yeah, I mean, I have a really good sense of that,

John Wessel 48:03
Yeah. And I think they develop heuristics too, where they couldn’t even articulate the exact thing, but they have an idea of, okay, if I talk to this many people, or my team talks to them, we’ll close X number of deals and get to our goal. And they may not even know that number, but they kind of have a heuristic understanding of it. Yeah, yeah, totally. And I think the other component here, too, is if you’re in an industry that’s just really stable and predictable, those heuristics and that tribal knowledge can work great for a really long time, yes. And then something like COVID hits, and it’s like, oh, we don’t know what works and what doesn’t work anymore, maybe, or everything is different, or some kind of technological disruption. Like, for us, even prior to COVID, we got hit with a bunch of things. We got hit with some increased tariffs, because we were doing importing, and that threw off a bunch of supply chains, like, do we need to re-source stuff? We got hit with COVID, we got hit with some other challenges. So I think that’s the challenge, right? If you ended up, as a lot of people did over the last 20 or even 30 years, in an industry that’s stable and fairly predictable as far as the sales cycle, you don’t really need the analytics, per se, until maybe you do, when everything’s different, right, and you need to dig in and figure out something else, like, this isn’t working. What’s the balance

Eric Dodds 49:19
there? And I’ll give you an example. So I used to do a ton of analytics, reporting, and studies around the marketing funnel. And one interesting thing about that was we had a team who looked at this really closely on a weekly basis, right? And when you talk about heuristics, one of the things that was interesting that I noticed over time was, you know, you can start to recognize patterns, right? And even if you don’t have models that describe these patterns specifically, it’s the ability to connect different things. So I’ll just give one example. We did a lot of stuff with SEO, sort of building a couple of directories on the website from an SEO standpoint, and we had some reporting that connected traffic down through to revenue eventually, right? But that’s a pretty long timeline. And so you’re looking for a signal before revenue actually occurs, right? Because, you know, you generate traffic, and then it’s months and months, right? But you need to be able to get a sense of where the business is going to land in a time period that’s a lot shorter than the total time. And so you look at it enough, and you start to notice, like, okay, if these certain directories behave in this certain way, then, I’ve looked at enough of these things all the way through the funnel, just manually, to get a really good sense, and it wasn’t worth building really robust reporting, or even trying to model it, at that point, right? But that was incredibly useful knowledge, and there were only maybe two people who could speak to that with a really high level of confidence, because they had just looked at it repeatedly, in detail, and manually followed those threads over and over again for a period of time, right? So back to my question: how do you balance that? Because that’s extremely valuable. In fact, I would say, okay, you have someone really good, if they can do that pattern matching and you can trust their intuition around that, it’s like, wow, that’s someone who’s super talented. In the near term, a lot of times it’s not worth it to go through the effort of actually building some sort of model around that, right? But it’s highly risky, because that knowledge lives with someone, or COVID hits and changes things, or whatever, right? So how do you balance that? When do you start to automate some of those things that are more tribal knowledge, or heuristics that people have developed because they’re just really good at their job, right? And part of their job is looking at things in way more detail, way more consistently, than other people. Yeah. I

John Wessel 52:19
mean, it’s definitely difficult. Well, I think one of the things that I’ve found valuable, and looking back on it, I wish I had done more of it, I did some of it, is those basic things that we know, like, all right, e-commerce company, daily sales, or SaaS, daily conversions, daily signups or something, yep. There are a lot of boring things where you look at a graph and it’s pretty much flat, and I wish I had gone and done more of that boring stuff and put monitors on it, right? Because it works most of the time, but things break. So at least in the SaaS and e-commerce world, I think with a lot of the data analytics, all the focus is on, let’s mine this out, let’s figure out, like, costs, personalization, which customers should we contact. I think there’s a lot of space in just monitoring some really boring stuff, like, let’s just monitor and make sure we get our average daily signups every single day, and that we didn’t break the signup button. There’s practical stuff there, right? Such a good point, right? And that kind of crosses some boundaries sometimes, between product analytics and maybe systems-monitoring stuff, so there are some fuzzy boundaries, but in my mind, it’s all data. So I think that would be one thing as far as a focus: focus on monitoring the boring, critical stuff, yep, and making sure it works every time, in some way. That’s valuable. And then, secondarily, you’re getting into the drivers behind the boring stuff. Like, yes, daily signups, all right, great, we need to hover in this range, and some kind of alert goes off if you’re out of range. And then inside that, okay, what are the drivers? Like you’re saying, SEO traffic, paid traffic, et cetera, yep. And then that’s the next level down: all right, let’s monitor those, roughly in the range they need to be, yeah,
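
Here is a minimal sketch, not from the episode, of the boring-but-critical monitor John describes: compare today’s signup count against a rolling baseline and flag it when it falls outside an expected range. The data, thresholds, and alert hook are hypothetical; in practice this could live in an orchestrator, a dbt test, or an observability tool.

```python
# Minimal sketch of a "boring metric" monitor: alert when daily signups leave the expected range.
# Data, thresholds, and the alert hook are hypothetical.
from statistics import mean, stdev

# Daily signup counts for the trailing weeks (would normally come from the warehouse).
history = [102, 98, 110, 95, 104, 99, 107, 101, 96, 105, 103, 100, 97, 108]
today = 61  # e.g., the signup button broke overnight

baseline = mean(history)
spread = stdev(history)
lower, upper = baseline - 3 * spread, baseline + 3 * spread

if not (lower <= today <= upper):
    # In a real setup this would page someone or post to Slack.
    print(f"ALERT: signups={today} outside expected range [{lower:.0f}, {upper:.0f}]")
else:
    print(f"OK: signups={today} within expected range [{lower:.0f}, {upper:.0f}]")
```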

Eric Dodds 54:20
Yeah, I think it’s a great point, because the people who have that really good intuition, those heuristics, are almost always the people who say, I think something’s broken. Because they’re essentially acting as the monitor, right?

John Wessel 54:43
Well, and I’ve been part of some pretty bad situations, especially in a business with high turnover. Say the business got acquired and then had a big turnover, or they just had turnover for some other reason, and you end up with a lot of new leaders at once. Within months, the heuristic method won’t work, right? And then things break. It’s especially bad if it’s e-commerce and you break checkout, or you don’t bill customers right because somebody was new in accounting, all those types of things. If you standardize and get that data into reporting, you can keep a lot of consistency even if you have turnover in key areas, because at least you’ll know, oh, cool, we’re getting the signups we normally get, checkout is still working. And beyond that, you’ll figure out the common things that happen, right?

Eric Dodds 55:37
That’s such a helpful framework. We talked about the different definitions, you know, reporting, analytics, dashboards, studies, but the concept of observability on some of that stuff, especially the boring, critical stuff, is really useful. Looking back, it’s like, man, it really would have been great to treat some of that as an observability problem.

John Wessel 56:00
I know. I did it a few times, was really happy with it every time I did, and still didn’t do it enough.

Eric Dodds 56:07
Yeah. The reason I like the concept of observability for those is that it carries a very different weight than “this is an automated chart on a dashboard,” even though there’s a huge amount of crossover. When you say observability, generally you’re implementing it on stuff where you need some sort of alerting system tied to it. I just love the different weight that has relative to some of those core things that are boring. And it is shocking how often something can go on for weeks before anyone notices, and then it becomes a really big problem and a fire drill, and it’s like, man, no one caught this.

John Wessel 56:56
It happens at every company I’ve ever worked for. And the reason it’s forefront in my mind, part of the reason I even have thoughts around this, is that I did a lot of DevOps work too. Observability for DevOps is just what you do; it’s crucial. When we were doing AWS and Azure type work, we instrumented everything: server monitoring, disk space, CPU, all the basics. You have alerts, you have call schedules for who’s on call. It’s very elaborate, the amount of data you can get. And then you go look at BI tools. Name a BI tool that has a true, first-party-supported alerting mechanism that’s really robust. Most of them can kind of do it, but I don’t know of any BI tool that has treated this as a first-party problem they’re really good at. There’s probably one that exists, but I’ve yet to see it, at least among the major tools.

Eric Dodds 57:56
Some of the packaged SaaS product analytics tools do have anomaly detection type features, which are helpful, but again, they can’t necessarily capture all of the boring stuff that you need to capture.

John Wessel 58:08
But it’s fascinating, right? You’ve got New Relic and Datadog and a lot of these DevOps tools for monitoring, and it’s elaborate what you can do: customize exactly how you capture the metrics, store it all as an event stream with the full history, and then split things out, alert certain teams, and have it call three people in order if they don’t answer, using some of these services. Wouldn’t you want that same level of rigor applied to, hey, we expected a thousand signups today and we got two? Wouldn’t you want that same level of urgency? I’ve just never seen it.
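As a rough sketch of what that DevOps-style rigor could look like for a business metric, the snippet below compares today’s value against a trailing baseline and pages a list of contacts in order until someone acknowledges. The history source, z-score threshold, and notify hook are assumptions for illustration; a real setup would hand the escalation off to an on-call tool rather than rolling its own.

```python
# Sketch: apply DevOps-style alerting to a business metric by comparing today's value
# against a trailing baseline and escalating through an ordered contact list.
# The history, threshold, and contacts are invented for illustration.

import statistics


def is_anomalous(today, history, z_threshold=3.0):
    """Flag today's value if it deviates strongly from the trailing history."""
    if len(history) < 7:
        return False  # not enough baseline to judge
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid divide-by-zero on a flat series
    return abs(today - mean) / stdev > z_threshold


def notify(contact, message):
    """Stub notification: in practice this would page and wait for an acknowledgement."""
    print(f"Paging {contact}: {message}")
    return False


def escalate(message, contacts):
    """Notify contacts in order until someone acknowledges."""
    for contact in contacts:
        if notify(contact, message):
            return contact
    return None


# Example: 28 days of signups hovering around 1,000, then a day with 2.
history = [1000 + (i % 5) * 10 for i in range(28)]
if is_anomalous(2, history):
    escalate(
        "Signups collapsed: expected ~1,000, got 2.",
        ["analyst-on-call", "eng-on-call", "vp-growth"],
    )
```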

Eric Dodds 58:47
Yeah, that is really interesting. One of the other things around observability that I love is that it heightens awareness of the impact of unforced errors. Someone I used to work with talked about this when I was really deep into marketing and building the funnel, and it was such a helpful framework. Part of being really good at that job is understanding all of the inputs, how you’re going to hit or exceed your number, and having a creative element to come up with ideas that help you achieve quarter-over-quarter growth, which is really hard. Those things are key. But the other thing is just avoiding unforced errors at all costs. He said if you can avoid unforced errors, it’s so helpful, because they can set your progress back so quickly. And a lot of times it is the stuff that breaks. We experienced that in a significant way: someone thought there was a problem on a part of the website that was a really key thing, and I was like, oh, yeah, it kind of seems like it, whatever, and it later turned out to be a huge problem. You sort of just have to recover from it, which is so painful, and it decreases the amount of compounding. So yeah, I love observability applied to that as well.

John Wessel 1:00:38
And it really applies so well to pretty much every business unit: sales, marketing, accounting, for sure. The other advantage is that most people can articulate it to you. If you’re trying to nail down definitions like customer lifetime value, or even sometimes margin, you can go down some real rabbit holes, like, how do we capture landed cost? And then you’re months down the road and nobody knows what the landed cost is. But for observability, it’s: hey, what would be really bad if this thing didn’t work? Accounting will know that for sure. Sales too. You can get to the end result quickly, like, hey, this needs to be right, or this needs to work, and then the data team has some real, meaningful work to build out the observability.

Eric Dodds 1:01:24
I love it. All right, I didn’t even notice how long we’d been recording, by the way. Oh no, I think over an hour. Wow. So we should probably land here. Any last parting words of wisdom on reporting and analytics?

John Wessel 1:01:40
I wish I had something. I guess, if you’re doing reporting and analytics, I would say one, don’t be afraid to do something more than once, especially if it’s new. That would be one. And my second one would be: when you give yourself that permission, okay, we can do it more than once, I think people are willing to take more risks in a positive way, like, hey, let’s try a new tool or do something a different way. Because when I got to production, it didn’t have to be perfect the first time.

Eric Dodds 1:02:10
Yep, okay, I’m going to try to eke out a word of wisdom. Ready for it? I haven’t had a formal role on a data team, but I used to work with someone who was in an operations role, back when I was doing a bunch of marketing stuff earlier in my career. A lot of times with marketing reporting it’s kind of chaotic: the business is changing and trying to grow, so you tend to go back and reevaluate definitions. Is the way I’m measuring this thing really reflective of the business? You can create too much complexity that way, but it can also be helpful; you’re just trying to figure out ways to describe what’s happening. But this ops person was absolutely incredible, and they fought me every time I wanted to change something, and we had to agree on it. I don’t want to say it was frustrating, but they really forced me to articulate why we needed to change a definition, and it had to be a really good reason. In the moment you don’t think about this a ton, but they later told me one of the reasons they dug in so hard was that even if your definition for something isn’t exactly accurate, if you’re consistent with it over a period of time, it’s way more meaningful for understanding trends, because all of the variables are consistent. They said it’s way easier to measure the wrong definition consistently and then run an analysis on the historical data with the definition changed than to go back and piece together a story where the definition changed multiple times. I just hadn’t thought about that.
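A small, hypothetical example of the point that ops colleague was making: if you keep the raw data and treat the definition as a function, a consistent series can always be re-derived under a new definition across the whole history, whereas a series whose definition changed mid-stream can’t be reconstructed. The field names and “qualified lead” definitions below are invented for illustration.

```python
# Sketch of "measure consistently, re-derive later": keep raw records, express the
# metric definition as a function, and recompute the full history under any definition.
# Field names and the two definitions are hypothetical.

def qualified_lead_v1(lead):
    return lead["score"] >= 50


def qualified_lead_v2(lead):
    # A later, stricter definition that excludes purchased lists.
    return lead["score"] >= 50 and lead["source"] != "purchased_list"


def monthly_counts(leads, definition):
    """Count leads per month under a given definition."""
    counts = {}
    for lead in leads:
        if definition(lead):
            counts[lead["month"]] = counts.get(lead["month"], 0) + 1
    return counts


leads = [
    {"month": "2024-01", "score": 62, "source": "organic"},
    {"month": "2024-01", "score": 55, "source": "purchased_list"},
    {"month": "2024-02", "score": 71, "source": "organic"},
]

# Both series cover the whole history, so trends stay comparable under either definition.
print(monthly_counts(leads, qualified_lead_v1))
print(monthly_counts(leads, qualified_lead_v2))
```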

John Wessel 1:04:15
want to obfuscate your success, then change the definitions all the time. If you want to obfuscate your failure, then change the definitions all the time.

Eric Dodds 1:04:23
That was a way more succinct way to say it.

John Wessel 1:04:25
But it’s true. And I’ve never worked with anybody who did it intentionally, but there were definitely times where we made so many changes that we’d look back and say, oh, look at this trend, and then, oh no, we changed that, remember? Oh yeah, that’s right. Totally ruins it.

Eric Dodds 1:04:36
Well, and it goes back to the tribal knowledge thing, right? Where do you centralize those changes? Different people make them, there’s often a lack of centralized documentation about these definitions, and then there are all of the places in the entire supply chain of analytics where the surgery is done to make changes. So anyway, that was a great lesson; I’m glad I collected that one. All right, well, thanks for joining us. We hope you enjoyed the chat with just John and me. Reach out if there’s a subject you want us to tackle; we always love hearing from our listeners. Tons of great shows this fall, and we’ll catch you on the next one. See ya. The Data Stack Show is brought to you by RudderStack, the warehouse native customer data platform. RudderStack is purpose built to help data teams turn customer data into competitive advantage. Learn more at rudderstack.com.