00;00;00;00 - 00;00;20;11 Unknown Have you ever felt like you're just constantly trying to catch up with new information? Especially, you know, in the fast-paced world of tech and business? It's like you want to be informed. You really crave those moments. But then information overload, it's just a constant battle. It absolutely is. Yeah. And it's not just us as individuals. Businesses. 00;00;20;14 - 00;00;37;25 Unknown Especially in the software as a service, or SaaS, space, have faced this exact same thing. But like on an organizational scale. How do they truly understand what their users need? And, you know, how their products are performing, not just superficially but deep down? Yeah, fundamentally. And how do they adapt quickly? That's a really powerful way to frame it. 00;00;37;26 - 00;00;59;28 Unknown Yeah. So how do we move from that feeling of information overwhelm to getting insights you can actually use? That's exactly what we're diving into today. We're going to explore how SaaS companies are making this fundamental transformation in how their software learns and improves, moving from those traditional, often quite slow methods to, well, cutting-edge AI-driven systems. 00;01;00;06 - 00;01;23;23 Unknown So our mission here is to unpack these really significant shifts happening in SaaS analytics, explore the best practices making companies, you know, way more responsive, more customer-centric, and give you the essential insights without feeling completely swamped by all the details. Okay, let's get into it. Yeah, let's do it. What's truly fascinating here is that the whole foundation of how SaaS products improve has just undergone this radical shift. 00;01;24;00 - 00;01;51;14 Unknown Historically, it was very human-intensive, almost like operating a massive machine with lots of manual levers, you know? Sounds incredibly inefficient. Can you paint a clearer picture? Like, how cumbersome was that old way for product teams? Oh, think about it.
Companies would manually collect feedback through things like NPS, CSAT, maybe CES surveys. They'd have to wade through tons of customer support tickets, conduct user interviews, meticulously sift through session recordings, analyze heat maps, behavioral analytics tools. 00;01;51;22 - 00;02;16;06 Unknown Then the product teams would spend, I mean, literally weeks manually poring over all this data, just trying to spot patterns and pain points. And this led to what you called periodic improvement cycles? Wow, that sounds incredibly slow. It was. Yeah, we're talking weeks or sometimes even months between collecting that initial feedback, analyzing it manually, then implementing changes, and then finally measuring the results. 00;02;16;11 - 00;02;36;28 Unknown Often using, like, AB testing and more surveys. The workflows were largely static. They required constant human intervention and development cycles just to change anything. So if a user had a new need or found a big bug, you had to wait ages for the company to even realize it, let alone fix it and get a new version out. Okay, so if the old way was so slow, what's the big breakthrough? 00;02;36;28 - 00;02;58;19 Unknown What lets companies break free from that cycle? I keep hearing about agentic AI. What does agentic actually mean here? That's the key term. Yeah, and it's truly the game changer. Agentic AI isn't just reactive. It refers to AI systems designed to operate autonomously, usually with defined goals. They take proactive actions, make decisions independently without needing constant human prompting. 00;02;58;20 - 00;03;26;27 Unknown So instead of waiting for a human to tell them what to do, they sort of observe, learn, and act. And this transforms that slow manual process into these autonomous real-time feedback loops. So these AIs are basically self-driving improvement engines. Precisely. Yeah, agentic AI enables continuous self-learning. Sometimes people call it a data flywheel. Yeah.
Unlike traditional SaaS that waits for, you know, scheduled surveys or someone to analyze the data, these agents learn from every single interaction instantly. 00;03;27;05 - 00;03;48;03 Unknown And this isn't just passive data collection either. Agents actively seek out new data. They query new sources, collect user feedback, analyze real-world outcomes, all autonomously. They go way beyond traditional analytics by pulling in external data too, like social media trends or what competitors are doing. And you mentioned real-time adaptation earlier. That sounds like a huge leap from weeks or months. 00;03;48;09 - 00;04;10;08 Unknown How exactly does that immediate adjustment happen? What's the biggest impact you've seen from that? It's totally transformative. These AI models can adjust their behavior immediately based on outcomes without waiting for the next product release cycle. So this means if a user struggles with a workflow, the AI might instantly, say, reroute them or offer some context-specific help, maybe even tweak the product's underlying logic on the fly. 00;04;10;10 - 00;04;31;05 Unknown The most significant impact? Agility. Companies can respond to user needs and market shifts almost instantly. That was just unthinkable a few years back. Okay, so this isn't just about faster feedback loops, then. It sounds like it's fundamentally rewiring the business itself. What are the key areas where this agentic AI is really reshaping how SaaS companies understand their world? 00;04;31;13 - 00;04;57;10 Unknown You've nailed it. It's fundamentally redefining how businesses operate and interact with customers. Traditional SaaS analytics are mostly reactive. They tell you what already happened. But with agentic AI, it shifts from reactive to proactive analytics.
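The observe-learn-act loop described here can be sketched in a few lines of Python. This is a minimal illustration under assumptions of our own: the class name, the event shape, and the three-failures threshold are all hypothetical, not any vendor's real agent API.

```python
from collections import Counter

class FeedbackAgent:
    """Toy sketch of an autonomous feedback loop: learn from every
    interaction instantly, then adapt behavior without a release cycle."""

    def __init__(self, struggle_threshold=3):
        self.struggle_counts = Counter()       # failed attempts per workflow
        self.struggle_threshold = struggle_threshold

    def observe(self, event):
        # Learn from each interaction as it happens.
        if not event["success"]:
            self.struggle_counts[event["workflow"]] += 1

    def act(self, workflow):
        # Adjust behavior immediately based on observed outcomes:
        # reroute struggling users to contextual help.
        if self.struggle_counts[workflow] >= self.struggle_threshold:
            return "offer_contextual_help"
        return "default_flow"

agent = FeedbackAgent()
for _ in range(3):
    agent.observe({"workflow": "checkout", "success": False})

print(agent.act("checkout"))   # struggling workflow gets help
print(agent.act("signup"))     # healthy workflow stays on the default path
```

A real agentic system would of course close the loop against live product telemetry and richer models; the point here is only the shape of the loop: no batch analysis step, no human in the middle.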
The AI continuously monitors market data, competitor offerings, social media sentiment, even emerging tech to actually predict future trends and identify opportunities before they even become obvious to human teams. 00;04;57;12 - 00;05;23;29 Unknown That's a massive leap from just reporting on the past. And this leads directly to automated improvement then, doesn't it? It does. Yeah. Imagine AI analyzing user behavior, feature adoption, feedback, and then not just identifying issues but actually recommending new features or improvements, or maybe even making small adjustments to product flows itself, autonomously. This just dramatically accelerates the product improvement cycle. 00;05;24;02 - 00;05;45;03 Unknown It lets product teams prioritize development based on actual observed user needs, not just guesswork, and maybe even bypass some of that manual dev work. Okay, so the focus of analytics also shifts, moves from being, like, interface-dependent to outcome-focused. In an agentic architecture, the core business logic shifts to the AI layer, right, which orchestrates work autonomously. 00;05;45;05 - 00;06;05;05 Unknown So analytics aren't just tracking clicks on a UI anymore, but directly measuring actual business outcomes. This brings up that huge question that's being debated everywhere. Does this mean SaaS is dead? Yeah, it's definitely one of the hottest debates in enterprise tech right now. The consensus, though, seems to be leaning towards a profound transformation rather than, like, a complete replacement. 00;06;05;07 - 00;06;28;09 Unknown Many experts think SaaS platforms will remain vital as systems of record, you know, storing workflows, trusted data, while these new agents might take over the user interface layer, becoming the main way users interact. So more of a hybrid integration. And we're already seeing this, right? Like Salesforce's Einstein AI acting as a copilot within CRM, or Adobe's marketing agents doing AB tests on their own?
00;06;28;13 - 00;06;50;24 Unknown The AI enhances it, doesn't just replace the existing SaaS. Makes it smarter, not obsolete. Precisely. That's a good way to put it. However, there's also the other view, famously articulated by Microsoft CEO Satya Nadella, that AI agents could fundamentally disrupt traditional applications by taking over the entire business logic layer and acting as the main interface to these vast data stores. 00;06;51;01 - 00;07;10;00 Unknown So that's the argument for a more radical shift, where traditional apps become just, you know, data back ends for the AI, right? Well, regardless of how it shakes out, the rapid adoption of agentic AI is undeniable. I saw a projection that by 2025, 25% of enterprises using AI will be piloting agentic AI, rising to 50% by 2027. 00;07;10;02 - 00;07;34;02 Unknown That's fast. It is fast, and the impact is clear. Companies using these agentic CRM systems have seen, on average, something like a 20% increase in customer satisfaction and a 15% reduction in customer churn. Wow. That's a significant ROI. Definitely. And it also has profound implications for pricing models. That traditional per-seat licensing model? It's rapidly giving way to usage- or outcome-based pricing. 00;07;34;08 - 00;08;02;22 Unknown Companies pay for what the software actually delivers in terms of value. And this also means operational shifts, decision making moving away from just IT departments to core business functions like finance, customer service. It requires SaaS vendors to become real partners in their clients' business goals. So the bottom line is the future of SaaS means deep integration of AI agents, outcome-based pricing, a bigger focus on industry-specific expertise that, you know, AI alone can't replicate, and maybe a willingness for companies to self-disrupt their own models. 00;08;02;27 - 00;08;28;22 Unknown Absolutely. And it's interesting because traditional SaaS companies actually have a bit of a head start here.
They already have the data ownership, the customer relationships. They're uniquely positioned to evolve. That makes sense. They're sitting on the data. They are. But okay, beyond the whole agentic AI revolution, what does modern SaaS analytics look like right now, incorporating all these best practices? Because it's gone way beyond just tracking page views or simple website traffic. 00;08;28;25 - 00;08;56;12 Unknown Oh, totally. The new analytics paradigm is all about sophisticated behavioral tracking. We're talking event-driven analytics, and a huge game changer here has been auto-capture technology. Tools like Heap and PostHog automatically capture all user interactions without needing manual setup first. Apparently this leads to up to 25% more accurate data than old manual methods, which means you can do retroactive analysis, capture stuff you might have missed, without ever having to instrument it in advance. 00;08;56;14 - 00;09;27;26 Unknown Huge. And with this explosion of data, defining a proper event taxonomy becomes crucial. Like, really critical. Best practice now is structuring events around the product-led growth, or PLG, funnel: acquisition, activation, retention, revenue, referral. And it's all about standardized naming conventions, consistency across teams, across products. We also see unified user identification becoming standard. Customer data platforms are processing over a trillion events daily, apparently, ensuring consistent tracking across complex user journeys, 00;09;28;03 - 00;09;54;10 Unknown different devices, different touchpoints. Okay, so this unified view feeds into more advanced behavioral analytics methods. Cohort analysis, for instance, that's essential for understanding user behavior over time, isn't it? You often hear about average feature adoption rates, like around 24.5% across the industry, maybe 31% for HR tools, 22.6% for fintech and insurance. Apparently, companies bringing in $5-10 million in revenue hit the highest rates, around 30.4%.
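A PLG-oriented event taxonomy like the one described can be made concrete as a small naming-convention check. The five funnel stages come from the discussion; the specific event names and the "stage:object_action" convention are illustrative choices, not an industry standard.

```python
# The PLG funnel stages mentioned above.
PLG_STAGES = ["acquisition", "activation", "retention", "revenue", "referral"]

# Hypothetical events following a "stage:object_action" naming convention.
EVENT_TAXONOMY = {
    "acquisition": ["acquisition:signup_completed"],
    "activation":  ["activation:first_project_created"],
    "retention":   ["retention:weekly_session_started"],
    "revenue":     ["revenue:plan_upgraded"],
    "referral":    ["referral:invite_sent"],
}

def validate_event(name):
    """Enforce the convention so tracking stays consistent across teams."""
    stage, _, action = name.partition(":")
    return stage in PLG_STAGES and bool(action)

print(validate_event("revenue:plan_upgraded"))  # True: follows the convention
print(validate_event("Clicked Button 3"))       # False: ad-hoc name, rejected
```

Running a validator like this in CI against the tracking plan is one cheap way to keep the "consistency across teams, across products" promise from decaying over time.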
00;09;54;10 - 00;10;19;26 Unknown But beyond just knowing these averages, what's the real insight? Like, if you're a fintech company at 18% adoption, is that a red flag or just normal variation? No, it's absolutely a red flag for strategic focus. The real takeaway is that feature adoption isn't just a vanity metric. It directly correlates with product maturity and, crucially, revenue bands. If you're a fintech company below that 22.6% benchmark, your core value proposition might not be resonating. 00;10;19;26 - 00;10;49;09 Unknown Or maybe it's just not discoverable enough. And that's exactly where your analytics should guide immediate intervention to improve that number. It highlights where you need to invest, basically. Right. And user journey mapping has also evolved. It's not about just assuming how users will behave anymore, is it? Not at all. No. It's about using data-driven methods, combining the quantitative with the qualitative insights to understand actual user behavior patterns, identifying those unexpected paths people take. And for product-market fit, 00;10;49;09 - 00;11;12;04 Unknown PMF, it's moved beyond just the old Sean Ellis test. You know, the 40% or more users saying they'd be very disappointed without the product. Now PMF includes things like retention stability, organic growth, and the depth of feature adoption. Are users not just using the product, but are they truly leveraging its most important features? Okay, so with all this data, these new methods, what are the leading platforms people are using? 00;11;12;04 - 00;11;37;19 Unknown Are there different flavors for different needs? Absolutely. There's quite a range. For example, PostHog is really becoming the go-to open source solution, especially for teams that are more developer-centric. They want full control, transparency. It offers a remarkably generous free tier, like 1 million events monthly, unlimited users, and it bundles everything.
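The adoption-versus-benchmark reasoning above is simple arithmetic, worth making explicit. The 22.6% fintech figure is the one cited in the conversation; the 180-of-1,000 user counts are a made-up example.

```python
def adoption_rate(users_who_used, users_with_access):
    """Feature adoption = users who used the feature / users who could have."""
    return users_who_used / users_with_access

FINTECH_BENCHMARK = 0.226  # industry figure cited in the discussion

# Hypothetical fintech product: 180 of 1,000 eligible users used the feature.
rate = adoption_rate(180, 1000)
print(f"{rate:.1%}")  # 18.0%
print("red flag: below benchmark" if rate < FINTECH_BENCHMARK
      else "on or above benchmark")
```

The useful habit is the comparison, not the number: track the rate per cohort and per revenue band so a drop below your segment's benchmark triggers investigation rather than sitting unnoticed in a dashboard.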
Product analytics, session replay, feature flags, AB testing, surveys, all in one place. 00;11;37;22 - 00;11;57;02 Unknown Then you have Amplitude. That's more the enterprise powerhouse. Excels at really sophisticated user journey tracking, predictive analytics with machine learning. Great for very large, complex data sets. And what about others in that space? Well, Mixpanel is still a leader for event-based analytics. Very intuitive interfaces, though its free tier recently adjusted. I think it's now 1 million events monthly too. 00;11;57;09 - 00;12;20;17 Unknown And then there's Statsig, which is emerging as this unified experimentation and analytics platform. They claim to process over a trillion events daily, with, like, 99.99% uptime and 50% lower cost than some competitors for things like AB testing. Okay. And beyond those big players, are there any specialized platforms catching your eye, maybe for specific needs? Yeah, definitely. We're seeing emerging platforms pop up. 00;12;20;24 - 00;12;46;28 Unknown Usermaven, for instance, is focused on privacy-first analytics using AI for insights. Quantum Metric is aimed at enterprise experience analytics, offering automated struggle detection, which is pretty neat. And for subscription-focused companies, you've got tools like ChartMogul and ProfitWell. They provide specialized SaaS metrics, MRR, churn prediction, pricing optimization. Very focused. So when you're choosing a platform from this diverse landscape, what are the most crucial things to consider? 00;12;47;01 - 00;13;14;12 Unknown The pricing model alignment is really key. Is it event-based, monthly-tracked-user-based, session-based? Each has pros and cons depending on how your product is used. And integration capabilities are absolutely vital. Leading tools offer, like, 450-plus pre-built integrations, API-first architectures. You need it to fit into your existing stack.
The trend is definitely towards consolidation, though. Companies seem to prefer platforms that can replace multiple tools rather than just adding another one to the pile. 00;13;14;17 - 00;13;40;12 Unknown Makes sense. And as companies collect all this increasingly granular data about user behavior, a crucial question comes up: how do they do this responsibly? Because as we're seeing, privacy isn't just a burden anymore, is it? It's actually becoming a clear competitive advantage. It truly is. Yeah. The regulatory landscape is just intensifying everywhere. GDPR enforcement, for example, average fines jumped 30% to €2.8 million. 00;13;40;15 - 00;14;03;20 Unknown And remember, Meta got that massive €1.2 billion fine for insufficient cross-border data transfer safeguards. In the US, you've got eight new comprehensive state privacy laws coming into effect in 2025. The focus has really shifted. It's not just about simple cookie banners anymore. It's about comprehensive privacy by design, emphasizing data minimization, enhanced user control by default. So this raises a big question for companies. 00;14;03;20 - 00;14;27;16 Unknown How do you actually implement analytics while staying compliant? Are there new approaches emerging? Absolutely. Cookieless analytics platforms are maturing. Think Fathom Analytics, Plausible Analytics. They offer GDPR-compliant tracking without consent banners, basically by avoiding collecting personally identifiable information, PII, altogether. That sounds like a significant step forward for user trust. Definitely. And I've heard more about server-side tracking lately. 00;14;27;17 - 00;14;49;21 Unknown What's the advantage there? Right. Server-side tracking is gaining a lot of traction, mainly because it enhances privacy, compliance, and data accuracy by processing data on servers rather than directly from the client's browser. Companies can apparently achieve up to 25% better data quality compared to client side.
Plus, you get improved consent enforcement, and it's more resistant to ad blockers. 00;14;49;24 - 00;15;14;05 Unknown And then integrating consent management tools like OneTrust or Google Consent Mode v2 allows for those granular consent options, compliant data collection, letting users really decide what they share. And on the real cutting edge, what about advanced privacy-preserving techniques? Are those practical yet? They're getting there. Techniques like differential privacy, which Google and Apple are already deploying for statistical analysis, are becoming practical. 00;15;14;08 - 00;15;42;24 Unknown They let you get insights while mathematically protecting individual privacy. And data minimization strategies are essential too: collecting only necessary data, having rolling data retention policies, using progressive consent collection. It's all about building trust from the ground up. So ultimately, responsible tracking means getting informed consent, anonymizing or aggregating data, handling it securely with strict access control, and avoiding that creepy tracking. 00;15;42;24 - 00;16;04;04 Unknown Right, like making sure personal content is masked in session replays. Exactly. Privacy isn't just about avoiding fines anymore. It builds user trust. It's a clear competitive advantage. Okay, let's shift gears slightly, moving to the underlying technical architecture. All these shifts we've discussed, they require some thoughtful decisions to make sure everything works and scales right, especially for a growing SaaS company. 00;16;04;05 - 00;16;26;03 Unknown Absolutely. The plumbing underneath matters a lot. So when we talk about the tech underneath, you mentioned hybrid architectures. What exactly does that mean for how data moves? What kind of tools are we talking about? Essentially, yeah, it's about combining two speeds of data processing.
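To make the differential privacy idea concrete, here is the textbook Laplace mechanism on a single count: add noise scaled to sensitivity/epsilon so the aggregate stays useful while any individual's contribution is mathematically obscured. This is a toy illustration of the standard technique, not how Google or Apple implement it in production.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace(0, scale) by inverse transform of a uniform draw.
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def noisy_count(true_count, epsilon=1.0, sensitivity=1.0):
    """Laplace mechanism: report count + Laplace(sensitivity/epsilon) noise.
    Smaller epsilon = more noise = stronger privacy guarantee."""
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(42)  # fixed seed just so the example is reproducible
reported = noisy_count(10_000, epsilon=1.0)
print(round(reported))  # close to 10,000, but never exactly the raw count
```

The trade-off is explicit in the parameters: a per-user sensitivity of 1 and epsilon of 1 adds only a few units of noise to a count of 10,000, so population-level insights survive while one user's presence or absence is hidden in the noise.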
You have real-time streaming, using tools like Apache Kafka or maybe Flink, to give you immediate updates on what's happening right now. 00;16;26;07 - 00;16;50;05 Unknown That's crucial for those instant feedback loops. Then you combine that with batch processing, using systems like Apache Spark or cloud data warehouses, for the deeper, broader historical analysis. This lets companies have both that instant responsiveness and a comprehensive view of everything that's ever happened. Best of both worlds, sort of. And when it comes to actually implementing this, is server side always the way to go, or is it more of a mix? 00;16;50;08 - 00;17;16;22 Unknown It's often a mix. Server-side tracking definitely offers better data accuracy, privacy compliance, resistance to ad blockers, but it can add some complexity. So the optimal approach is often a combination: client side for immediate user interactions, events like clicks, you know, and then server side for maybe revenue events or more sensitive tracking. API-first architectures also help here, supporting flexible, scalable data delivery, 00;17;16;22 - 00;17;38;11 Unknown letting systems talk to each other smoothly. And performance optimization must be critical. You're collecting so much data, you can't let your analytics slow down the actual product. Exactly. That's a huge no-no. Efficient data collection through things like event batching, query optimization with proper indexing and caching, and infrastructure scaling using cloud-native architectures are all pretty standard practice 00;17;38;11 - 00;17;59;04 Unknown now for growing SaaS companies. You just can't have insights at the expense of the user experience. Okay, now let's connect all of this to product-led growth, or PLG, which has its own specific ways of measuring things. What are the key metrics here that stand out? Right. This is where it gets really actionable for those PLG companies. Product-qualified leads, or PQLs, are super critical.
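The event-batching idea mentioned here is easy to sketch: buffer tracking events and flush them in one request once the batch fills, so analytics collection doesn't add a network round trip per interaction. Everything below, including the batch size of 50 and the `send_batch` callback, is an illustrative stand-in for a real SDK's transport layer.

```python
class EventBatcher:
    """Buffer events and ship them in batches instead of one per request."""

    def __init__(self, batch_size=50, send_batch=None):
        self.batch_size = batch_size
        self.buffer = []
        self.send_batch = send_batch or (lambda events: None)
        self.flushes = 0

    def track(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send_batch(self.buffer)   # one request instead of fifty
            self.flushes += 1
            self.buffer = []

sent = []
batcher = EventBatcher(batch_size=50, send_batch=sent.append)
for i in range(120):
    batcher.track({"name": "page_view", "n": i})
batcher.flush()  # drain the remainder, e.g. on page unload or shutdown

print(batcher.flushes, [len(b) for b in sent])  # 3 [50, 50, 20]
```

Production SDKs typically add a time-based flush as well (send whatever is buffered every few seconds), so low-traffic sessions don't hold events indefinitely.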
00;17;59;07 - 00;18;20;26 Unknown These are users who show meaningful engagement with the product, indicating they're actually likely to convert, not just anyone who's signed up. Industry benchmarks suggest that somewhere between 20 to 40% of free trial users should probably become PQLs. It's about measuring real intent. And the PLG flywheel model replaces those old linear funnels, right? What does that look like in practice? 00;18;20;27 - 00;18;41;13 Unknown Yeah, exactly. It replaces linear funnels with these circular frameworks. It tracks user progression from being a complete stranger all the way to becoming a champion, someone who not only uses your product but actively advocates for it. And another key concept is natural rate of growth, or NRG. This measures your organic growth without any paid marketing or sales intervention. 00;18;41;15 - 00;19;02;14 Unknown Leading PLG companies often achieve 20% or more NRG. It indicates really strong product-market fit and maybe even some viral potential. So when we look specifically at PLG, what are the most crucial, most actionable metrics that really jump out to you? The ones you absolutely have to track? Okay, well, retention rate is foundational, obviously. It directly shows product stickiness. 00;19;02;17 - 00;19;25;17 Unknown And cohort analysis is crucial here to see how different groups of new users are retaining over time. Daily and monthly active users, DAU and MAU, and their stickiness ratio also gauge overall engagement and growth. For example, a DAU/MAU ratio of 50% means your average user is active 15 days a month. That's a pretty strong indicator of getting value. And feature adoption rate? 00;19;25;17 - 00;19;51;29 Unknown That seems huge too. It truly is. It's the percentage of users who use a particular feature out of everyone who could potentially use it. If adoption is low, it points towards maybe discoverability issues, or perhaps a lack of perceived value in that feature.
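The stickiness arithmetic used in that example checks out and is worth writing down, since DAU/MAU is one of the few metrics teams compute by hand. The 5,000/10,000 figures are made up to reproduce the 50% case from the conversation.

```python
def stickiness(dau, mau):
    """DAU/MAU ratio: roughly, the share of monthly users active on a given day."""
    return dau / mau

ratio = stickiness(dau=5_000, mau=10_000)
print(f"{ratio:.0%}")  # 50%
# Interpreted over a 30-day month, that's about 15 active days per user:
print(f"~{ratio * 30:.0f} active days/month per average user")
```

One caveat worth keeping in mind: the "active days" reading assumes fairly uniform usage. A product where half the users show up every day and half never return also produces a 50% ratio, which is why cohort-level retention curves are tracked alongside the headline number.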
There's that famous example from Buffer, right? They discovered their emoji picker, which wasn't even a core feature, was wildly popular, just by looking at click tracking data. That led them to invest more in that little delighter feature, because their analytics showed it actually mattered a lot to users. 00;19;52;02 - 00;20;17;13 Unknown Yeah, yeah. Great example. Then we have conversion rates and funnel metrics, which obviously identify the biggest drop-offs in user journeys, like from free trial to paid. Crucial stuff. And time to value, TTV. How long it takes a new user to actually get the core value from the product. Best-in-class PLG companies apparently maintain activation rates of 20 to 40%. And one of my favorites, rage clicks and error signals. 00;20;17;17 - 00;20;36;07 Unknown We've all been there, right? Frantically clicking a button that doesn't work or hitting some kind of dead end. It feels like the app is just mocking you. Exactly. And for product teams, those frustrated clicks are like pure gold. They're direct, unfiltered feedback about a broken user experience, often way more honest than a survey response could ever be. 00;20;36;14 - 00;21;03;01 Unknown Fixing those isn't just about polish. It directly impacts retention and satisfaction. These are critical UX frustration signals. They show design or functionality problems that absolutely need fixing. Totally agree. And finally, for expansion revenue optimization, getting more revenue from existing customers, companies are using customer health scoring. These are predictive models for churn and expansion opportunities, integrating usage data, engagement levels, support interactions, all of it. 00;21;03;03 - 00;21;26;07 Unknown Tracking specific expansion indicators is also key. Like, are users hitting usage limits, exploring premium features, inviting colleagues, adopting advanced use cases? Companies that achieve 30% or more expansion revenue typically track these things really closely.
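Rage-click detection as described can be sketched with a sliding window over click events: several clicks on the same element within a short interval is treated as a frustration signal. The three-clicks-in-two-seconds threshold here is an illustrative choice, not a standard; real tools tune it empirically.

```python
def find_rage_clicks(clicks, min_clicks=3, window_secs=2.0):
    """clicks: list of (timestamp_secs, element_id), assumed time-sorted.
    Returns the set of elements that received a rage-click burst."""
    flagged = set()
    for i, (t0, elem) in enumerate(clicks):
        # Count same-element clicks within the window starting at this click.
        burst = [c for c in clicks[i:]
                 if c[1] == elem and c[0] - t0 <= window_secs]
        if len(burst) >= min_clicks:
            flagged.add(elem)
    return flagged

clicks = [(0.0, "save_btn"), (0.4, "save_btn"), (0.9, "save_btn"),
          (5.0, "nav_home"), (30.0, "nav_home")]
print(find_rage_clicks(clicks))  # {'save_btn'}: three clicks in under a second
```

Feeding the flagged elements into an error dashboard, alongside JavaScript error signals from the same session, is what turns this from a curiosity into the "pure gold" feedback channel described above.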
It's all about spotting those growth signals within your existing customer base. Okay, so that's a lot of metrics and methods. How do we actually make all of this happen? 00;21;26;07 - 00;21;47;26 Unknown How do we truly integrate analytics into continuous improvement workflows, so it's not just about collecting data? Yeah, that's the critical step. It really starts with fostering a data-driven culture, and that has to be championed by leadership. Data needs to be visible, accessible to all teams, not just locked away with analysts, fostering a shared understanding of the key KPIs across the company. 00;21;47;28 - 00;22;14;20 Unknown It's about empowering everyone to see and understand the story the data is telling. That makes a lot of sense, and it means integrating analytics into the day to day, like agile rituals, right? Precisely. Yeah, yeah. Incorporating insights into sprint planning, daily stand-ups, maybe OKR processes. It helps prioritize work based on actual user impact. So, for example, using data to prioritize fixing a bug on the mobile checkout flow because analytics show high drop-offs there. 00;22;14;22 - 00;22;39;13 Unknown It stops analytics being an afterthought and makes it central to decision making, what gets worked on next. And running experiments, closing the loop. That seems essential too. You can't just make a change and hope for the best. No, you definitely can't. Using things like feature flagging and AB testing frameworks, maybe ICE prioritization (impact, confidence, ease) or RICE (reach, impact, confidence, effort), helps test changes systematically. 00;22;39;15 - 00;23;01;12 Unknown You measure the results, learn from the outcomes, good or bad. You're constantly iterating and improving based on real user behavior. And ultimately, it comes down to connecting those product metrics to actual business outcomes. If you can't show how product improvements lead to tangible financial results, it's probably pretty hard to get executive buy-in for more resources. Absolutely.
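The RICE prioritization mentioned here is just a formula: score = (reach × impact × confidence) / effort. That math is the standard framework; the backlog items and their numbers below are invented for illustration.

```python
def rice_score(reach, impact, confidence, effort):
    """RICE: (reach * impact * confidence) / effort.
    reach: users affected per period; impact: per-user effect multiplier;
    confidence: 0-1; effort: person-months (or any consistent unit)."""
    return (reach * impact * confidence) / effort

# Hypothetical backlog, scored and sorted highest-priority first.
backlog = [
    ("fix mobile checkout drop-off", rice_score(8000, 2.0, 0.9, 3)),
    ("new reporting dashboard",      rice_score(1500, 1.0, 0.5, 8)),
]
backlog.sort(key=lambda item: item[1], reverse=True)
for name, score in backlog:
    print(f"{score:8.0f}  {name}")
```

The value of the exercise is less the exact scores than the forced conversation: reach and impact have to come from the analytics discussed above (drop-off rates, adoption data), not from whoever argues loudest in sprint planning.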
Linking user metrics, 00;23;01;12 - 00;23;28;00 Unknown adoption, engagement, retention, to the larger KPIs like revenue and churn is non-negotiable. When product teams can clearly show, hey, feature adoption increased 20% among this cohort, and as a direct result, expansion revenue grew by 6% in those accounts, that's undeniable. That's how you demonstrate real value and secure further investment. And maybe most importantly, it's about being willing to iterate and embrace surprises, right? 00;23;28;07 - 00;23;50;04 Unknown Data will challenge your assumptions. That unexpected pattern or that niche feature users suddenly love, like the Buffer emoji picker, that can reveal a new opportunity you never even anticipated. Exactly. And if a metric drops after a release, don't panic. Treat it as a learning opportunity. Investigate what went wrong. Iterate on a fix quickly. It's all just part of that continuous learning cycle. 00;23;50;05 - 00;24;13;22 Unknown This really is a fundamental shift in how software learns and improves itself. It truly changes the whole relationship companies have with their product and ultimately with their users. So to kind of recap our deep dive here: we've seen this fundamental shift in SaaS analytics towards really sophisticated, privacy-compliant, often AI-powered behavioral tracking systems. And these systems are driving both product improvement and business growth. 00;24;13;25 - 00;24;37;23 Unknown Mastering these shifts means treating analytics not as some separate function, but as an integrated capability, informing every aspect of product development, customer experience, and business strategy. Absolutely. It's about creating that virtuous cycle. Deploy changes, measure their impact, learn from it, and then deliver an even better product experience next time. This in turn drives higher adoption, better retention, more customer satisfaction.
00;24;37;25 - 00;24;58;26 Unknown And that becomes a key competitive advantage, especially in a crowded market. So in a world with just more and more information coming at us, the future of competitive advantage isn't just about having data, it's about creating systems that learn, adapt and improve themselves continuously delivering more value, making the products we use smarter, more intuitive, and ultimately more aligned with what we truly need. 00;24;58;29 - 00;25;07;20 Unknown So maybe the final thought for you, the listener, is how will you apply this understanding to the systems you interact with every day, and perhaps even to your own learning?