A discussion about experimenting with data-driven onboarding to boost customer experience during trials and drive business outcomes.

TechSmith's Snagit Strategy Lead, Daniel Foster, shares how he leveraged product usage data, segmentation, and targeted onboarding messaging to improve trial users' experiences. He discusses his experimentation with onboarding communications to lift trial conversions and covers:

  • Developing a strategy around hypotheses to test
  • Collecting data to identify users' stated "job to be done" with the application
  • Creating user segmentation and testing in-product messaging to improve the user experience during the trial
  • Analyzing the impact of improving customer experience on trial conversion rates

Transcript

Louise S. (00:00:02) - Hello everybody, and welcome to today's webinar, Turning Trial Users Into Successful Customers, with Revenera and our guest from TechSmith. I'm your host, Louise Stebbings, and I work in marketing here at Revenera. Before I introduce you to our speakers, a little bit of housekeeping to make sure you can enjoy the webinar properly. I'm sending a thumbs up / thumbs down symbol to your screen now. If you can hear me and see everything okay, please press the green thumbs-up symbol so I know you're on board. While you're doing that, I'll also let you know that you can move the boxes around on your screen and minimize or maximize the slide area, whatever you want to do. Today's webinar will last for about 30 minutes.

Louise S. (00:00:44) - If you have any questions as we're going through, pop them into the questions box and we'll do our best to get to those throughout the webinar and at the end as well. Just so you know, some networks cause the slides to advance a little more slowly than others, so we recommend logging off your VPN if you notice an issue like that. If what we're saying seems a little slow or out of sync with the slides we're showing, press F5 on your keyboard and that should bring everything back up to speed. Finally, for any other technical problems, there's a question mark symbol at the bottom of your screen; click on that and you'll find a technical FAQ.

Louise S. (00:01:24) - You'll also receive a copy of this recording within the next 48 hours, so if you're taking notes or want to watch anything again, you can do that, no problem. Our speakers today are TechSmith's Snagit strategy lead, Daniel Foster, and Vic DeMarines, VP of product management for software monetization here at Revenera. Dan and Vic will be discussing experimenting with data-driven onboarding to boost customer experience during trials and to drive business outcomes. Your moderator for today's webinar is Michael Goff, principal of product marketing here at Revenera. So with that, I will say: over to you, Michael.

Michael G. (00:01:59) - Thanks, Louise. Welcome, everybody. I'm very excited for everyone to be here today, and especially to have Daniel with us. This is something a little bit different; it's not going to be your typical webinar. We're going to make very little use of slides, because it's really meant to be more of a conversation. As Louise mentioned, if you have questions as we talk, certainly put those into the Q&A box and we'll see what we can do. But again, the whole point of this is turning trial users into successful customers, and to get successful customers, first you have to turn users into customers. So, Daniel, maybe you can start off by telling us a little bit about what the culture of testing and experimentation is like at TechSmith.

Daniel F. (00:02:43) - Yeah, for sure. First off, if folks aren't familiar with the products, Snagit and Camtasia are the best-known TechSmith products. We've been in business for over 30 years and actually started out with freeware as our model, but for years and years we've had a trial model. Trial is great; a 30-day trial is pretty typical for our products. I work a lot with Snagit, which has basically a 15-day trial, plus you can extend it to get 15 more. So there's a lot of room for experimentation. A lot of things we do, or had historically done, in marketing happened before the download: this kind of ad versus that kind of ad, optimizing the funnel. One of the areas where I really coveted being able to experiment more was what we do during the trial.

Daniel F. (00:03:34) - And the surface isn't only inside the product, because we also send emails. When you start a trial, you get a drip nurture series to help you identify different functionality and use cases that should connect with you. That was already going and had been for years, but what happened inside the trial was pretty locked in, and it's tough to run an experiment when you can only push a new version every time you release an update to the product. Years ago that was sometimes six months between versions or updates. Now we release a lot more frequently, but it's still not the quick experimentation speed that you want. So one of the things we've been able to do through having the usage analytics is use that, especially with ReachOut.

Daniel F. (00:04:27) - That's the ReachOut functionality within Revenera that lets us push messages into the product and have them appear to users while they're using Snagit, specifically during the trial. And because we can make those messages intelligently contingent on something about you or how you're using the product, we can tailor them and do faster experimentation: try some messages and see if we can influence usage and ultimately increase conversion. And that's simply by helping people connect with the value more quickly.

Michael G. (00:05:04) - Yeah. So when you're talking about sending specific messages to specific people, that's obviously based on segmentation, based on thresholds they've reached in usage or other things. Vic, now may actually be a good time for you to level set for everybody and give a quick overview of what software usage analytics is all about.

Vic D. (00:05:21) - Yeah, just a quick introduction. Daniel mentioned ReachOut, for example; that's in-app messaging functionality. It's probably familiar to almost everyone from the web, where you see a lot of this in-app experience, but for on-premises applications it's much more difficult to do. The usage intelligence platform's key functionality is tracking the key events within your application, really peeling the onion to understand, in an anonymous way, how people are interacting with the application: what features are being used, what aren't being used, and certain meta information about those features that you want to collect. That way you, the developer or software producer, can build a better experience and deliver the right value to that end user. ReachOut, the in-app messaging, is how you act on that data: once you learn how features are being adopted, how do you change behavior or make the experience better? Websites and SaaS applications have this kind of messaging, maybe too much of it at times, but having in-app messaging delivered in the context of an application someone is getting value from is really important.
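
To make the "anonymous event tracking plus in-app messaging" idea concrete, here is a minimal sketch in Python. It is not the Revenera SDK and every name in it is hypothetical; it only illustrates the general pattern of collecting anonymous feature events and then deciding whether an in-app tip is worth showing.

```python
# Conceptual sketch only -- NOT the Revenera SDK API. All names are hypothetical.
import time
import uuid
from dataclasses import dataclass, field


@dataclass
class UsageEvent:
    """An anonymous feature-usage event: no user identity, no captured content."""
    install_id: str          # random per-install ID, not tied to a person
    category: str            # e.g. "capture", "editor", "share"
    action: str              # e.g. "video_capture_started"
    metadata: dict = field(default_factory=dict)
    timestamp: float = field(default_factory=time.time)


class UsageTracker:
    def __init__(self):
        self.install_id = str(uuid.uuid4())   # anonymous identifier
        self.events: list[UsageEvent] = []

    def track(self, category: str, action: str, **metadata):
        self.events.append(UsageEvent(self.install_id, category, action, metadata))

    def has_used(self, category: str, action: str) -> bool:
        return any(e.category == category and e.action == action for e in self.events)


# A ReachOut-style decision: only surface an in-app tip if the feature it
# promotes has not been used yet during the trial.
tracker = UsageTracker()
tracker.track("capture", "image_capture_completed", region="window")

if not tracker.has_used("capture", "video_capture_started"):
    print("Show in-app tip: 'Did you know Snagit can also record video?'")
```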

Michael G. (00:06:43) - Thanks, Vic. So Daniel, at the heart of this is improving the customer experience, and you've done some really good experiments with user onboarding. How important is it during the onboarding process to shorten that time to value, so you make sure you're improving the experience for trial users?

Daniel F. (00:07:03) - So I've talked about this concept a little bit to a few different audiences. Some of those audiences are the people actually making the content that helps make their customers successful. One of the analogies I've used is this visual of a bridge. When I moved from marketing over to product and started working with our engineers at TechSmith, one of the things I wanted to keep evangelizing was: you can ship features, but unless someone has connected with the value of those features and the use cases around them, you've built half a bridge, and it's exactly as useful as half a bridge, because you haven't actually delivered the promise of the product to them.

Daniel F. (00:07:46) - The idea of time to value is not unique to me; I borrowed it from other people. But I drew a very simple graph that asks: over the time of, let's say, a trial period, how quickly can you connect people with the things the product can do for them, so you move them above the line, some threshold where they say, okay, this is going to do it for me, I'm going to adopt this? Maybe the investment is that they start really using the trial and put in some time; maybe it's actually purchasing a license. Either way there's an investment, so how do you move people more quickly through that? The long game, which I drew as the green line here, the top line, is how you keep leveling up. It's not just how do I move you above the line a little faster, but over time, as you adopt, as I retain you, as you continue to find value, you're exposed to new use cases and new functionality and continue to grow.

Daniel F. (00:08:54) - So that's the overall concept.

Michael G. (00:08:57) - Yep. And I know you've identified what you call jobs to be done to help move people along through that trial process. How did you identify what those jobs to be done should be? Was it guesswork, or were you doing something else to really home in on where you're going to deliver the most value to customers and trial users?

Daniel F. (00:09:17) - Yeah. Just to make sure we're all on the same page about what jobs to be done are, because this term gets used a lot and frankly there are people who debate and argue about how to use it: the part that's most useful, I think, is the concept that your potential customers are there to hire your product to do something for them. Some people get very esoteric about your intentions and what it ultimately does for you in your life. Maybe, but in the moment you're here to try this product, or evaluate adopting it, because you have something you're trying to get done, something you're trying to accomplish. So honestly, when we started this process, I didn't want to assume, because our product Snagit is very horizontal.

Daniel F. (00:10:03) - It's adopted by people in every role imaginable across every industry, so it felt really overwhelming. One of the sources we actually went to was reviews. G2 Crowd has a lot of reviews, and there are other sites where you can get them, but the point was: what do people say when they're asked, organically, what business value are you driving from this product, what do you want it to do for you? We collected a lot of those organic reviews and categorized them, and what we ultimately came up with was a refined list. I won't go through all the details of the refinement process; feel free to grab my contact info and ask me if you're really interested in that.

Daniel F. (00:10:49) - We took the myriad things people said they were trying to do and boiled them down to a list of nine. One of the things we started doing was presenting that list to new trial users as a form when they started the trial, saying: if you tell us a little bit about what you're trying to accomplish in Snagit, in exchange we'll provide some onboarding, some coaching, that connects you with that as quickly as possible. And then, not to get super technical, but the response to that lives in a database that gets queried by our product through Revenera and pulled in as a custom property, if you're familiar with usage analytics, which we can then use to target and focus our messages. So if somebody says, for example, "I'm trying to create documentation or technical guides," then certain features are super important for them and other features can wait; it's fine if they discover those later, but it's probably not hour one, day one. So we've been able to really focus the introduction of the product around the job they're trying to accomplish.
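
A rough sketch of the targeting idea Daniel describes: the stated job to be done comes down to the client as a custom property and selects which onboarding messages are queued first. The job names and tips below are invented for illustration; TechSmith's actual list of nine is not shown in the talk.

```python
# Hypothetical mapping from a "job to be done" custom property to the
# onboarding content surfaced first. All jobs and tips are illustrative.

JOB_TO_ONBOARDING = {
    "technical_documentation": [
        "Tip: capture scrolling windows for long reference pages",
        "Tip: use step numbering and callouts for procedures",
    ],
    "marketing_content": [
        "Tip: apply templates and brand colors to your captures",
        "Tip: export GIFs for social posts",
    ],
    "training_content": [
        "Tip: record a quick screen video with narration",
        "Tip: trim and share videos straight to your team",
    ],
}

DEFAULT_ONBOARDING = ["Tip: take your first capture with the Capture button"]


def onboarding_queue(custom_properties: dict) -> list:
    """Return the ordered onboarding messages for this trial install.

    `custom_properties` stands in for the values the desktop client pulls
    down alongside its usage-analytics configuration.
    """
    job = custom_properties.get("job_to_be_done")
    return JOB_TO_ONBOARDING.get(job, DEFAULT_ONBOARDING)


print(onboarding_queue({"job_to_be_done": "training_content"}))
```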

Vic D. (00:12:02) - Daniel, this is a pretty cool process. I'm familiar with Snagit and Camtasia as well, and there's certainly a business-to-consumer aspect to them, but there's also business-to-business. I'm trying to think how this translates to, say, a computer-aided design product or a more scientific application. I guess it could translate; when you do the jobs to be done, you just have to aggregate to a point, right? Even for a very complex product.

Daniel F. (00:12:30) - Yeah. And I think for a product that has a pretty homogenous user base it's going to be different; it's not all over the map. It might be two levels or something. One lens you could think about might be: are you a power user of CAD software in general, or are you new to CAD software? If I need to introduce you to CAD software and a lot of its concepts, that's a different set of tasks than if I just need to introduce you to our flavor of CAD software and some of its specific features. So I think that's going to vary and evolve over time. We're now evaluating this ourselves: okay, this has been really useful as one lens. If we were to ask another question or collect one more data point, what would it be? Would industry be most valuable, or would your level of familiarity with Snagit be most valuable? And then how would we actually use that to tailor your experience?

Vic D. (00:13:29) - So, going back to your database and those responses: if you see a user on a particular use case, like the tech guide example you mentioned, the usage analytics lets you use those as custom properties to segment the data. And if you see that they're not hitting certain functionality, do you treat that as an opportunity to guide them a bit more, because it's at odds with what they said they wanted to do? Do you go to that level in terms of guiding them, or messaging them?

Daniel F. (00:14:03) - Yeah, we try to have some guiding principles, because it's always a challenge to know how much is too much and how pushy you can be. You're ultimately trying to help the user, but sometimes it doesn't feel like help, right? We all remember Clippy; it's a tired whipping boy at this point, but that was not helpful for a lot of people. So these are some of the principles we try to keep in mind: whatever we're doing, we want it to be relevant; it's better to focus on a few things rather than a lot of things; and how do we make it timely? So we do some filtering to say: you fall into this segment and you have not already discovered that feature, therefore we'll show you this part, we'll...

Vic D. (00:14:48) - ...do it. Okay, so some rules.

Daniel F. (00:14:49) - And if you have already discovered that feature, we might not.
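
The "relevant, focused, timely" principles can be read as a simple suppression rule, sketched below under the assumption that the client knows the user's segment and which features they have already used. The structure and limits are illustrative, not TechSmith's implementation.

```python
# Minimal sketch of the filtering described above: show a tip only if it is
# relevant to the user's segment and promotes a feature they have not yet
# discovered, and keep the number of active tips small. Names are hypothetical.
from dataclasses import dataclass


@dataclass
class Tip:
    target_job: str          # segment the tip is meant for
    promotes_feature: str    # feature the tip introduces


def tips_to_show(tips, job_to_be_done, features_used, max_active=2):
    eligible = [
        t for t in tips
        if t.target_job == job_to_be_done              # relevant
        and t.promotes_feature not in features_used    # not already discovered
    ]
    return eligible[:max_active]                       # focused: a few, not many


tips = [
    Tip("technical_documentation", "scrolling_capture"),
    Tip("technical_documentation", "step_tool"),
    Tip("marketing_content", "templates"),
]
print(tips_to_show(tips, "technical_documentation", features_used={"step_tool"}))
# -> only the scrolling-capture tip: right segment, not yet discovered
```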

Vic D. (00:14:52) - Yeah, that makes a lot of sense, versus going to a car dealer website where as soon as you hit the homepage there are 15 things popping up at you. You know what I mean, where it's noisy.

Daniel F. (00:15:03) - Yeah.

Michael G. (00:15:05) - Well, I do think it's interesting, Daniel, that you started with the anecdotal, subjective feedback you gathered from those review sites. That's actually a really smart way of honing down your list, then pulling it into the application, presenting it, and measuring it. So from that list of jobs to be done, what are some of the things you've learned in terms of improving the customer experience and its impact on trial conversions?

Daniel F. (00:15:32) - Yeah. One of the refinement steps we took was partly to validate that list of jobs to be done and make sure there were real differences between them, because if most of them were essentially the same in how you use the product, then we shouldn't treat them differently. We did use usage analytics there, because we could pull reports and ask: if someone said their job to be done was "I'm making marketing content," how does their usage of the product differ from someone who says "I'm using it to make technical documentation," for example? A lot of things are going to be the same; there's maybe 60% or so that's pretty much identical.

Daniel F. (00:16:19) - But as you move out from that, you start to find some differentiation: they bring content into Snagit in different ways, some of the editing tools they use are a little different, they're more likely to use this one than average, and they share the content to somewhat different destinations because they're in marketing. Some of it is intuitive, and some things were a surprise to us. We learned that people making training content were definitely using the video functionality in Snagit more, and that's really helpful, because we have these big buckets: it's an image editor, it's also a simple video creator, and you can make GIFs. So which of those should we talk to you about first?
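
As a rough illustration of this kind of segment comparison, the sketch below computes per-segment feature adoption against the overall average to surface differentiating features. The numbers are invented sample data, not TechSmith's figures.

```python
# Illustrative segment-versus-feature comparison on fabricated data.

# (job_to_be_done, feature) -> number of trial users in that segment who used it
usage = {
    ("marketing_content", "video_capture"): 120,
    ("marketing_content", "templates"): 300,
    ("training_content", "video_capture"): 450,
    ("training_content", "templates"): 90,
}
segment_sizes = {"marketing_content": 600, "training_content": 700}


def adoption_by_segment(feature):
    """Share of each segment's trial users who touched the feature."""
    return {
        job: users / segment_sizes[job]
        for (job, f), users in usage.items()
        if f == feature
    }


overall_video = sum(
    u for (j, f), u in usage.items() if f == "video_capture"
) / sum(segment_sizes.values())

for job, rate in adoption_by_segment("video_capture").items():
    print(f"{job}: {rate:.0%} used video capture (overall {overall_video:.0%})")
```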

Daniel F. (00:17:03) - So yeah, that has helped us: if you say you're in this training space, we need to make sure you're aware of the whole set of functionality around video creation. And a lot of that came through those proof points of seeing that people actually do use the product differently depending on which segment they fall into. Another lens we applied, outside of the usage analytics, is preference, and we did that through surveying. I like to say that analytics tells you what; it doesn't always tell you why. That's true of any analytics: it describes what's happening, but you often have to fill in the gaps about why. We even have usage analytics on our web content, so we know, for example, that people really like to watch video.

Daniel F. (00:17:54) - But we didn't just assume that was true. We actually went and surveyed customers and asked: when you start to use a new technology, whether it's ours or another piece of new technology, what formats do you go to first? That was some of the research on that, and honestly, video is just super dominant as the most helpful format people go to. And where do they find it? They actually go to Google. They start with Google, whatever the question is; many of the results that come up are embedded video clips, and that's how they connect. Knowing that helped us say: all right, we know roughly what topics we want to get in front of people and what format of content we should prefer. So if we have a video about a topic, let's show them that, because it aligns with people's preference for how they want to ingest the content.

Vic D. (00:18:51) - So Daniel, we were talking about...

Daniel F. (00:18:53) - Our tools are tools for making videos and visual content, so it's also good that we can show off how we're using our own tools to our customers.

Vic D. (00:19:03) - Yeah, we were talking about that a little bit, and about how powerful YouTube is. Having worked for multiple product companies, we have our own YouTube content, and I'm sure TechSmith pushes out its content too. But do you actually have programs where you get other parties, evangelists, product power users, to promote the product and put their own content out there? Because I think people also want to hear from other users, their peers. Is that something you've done as well?

Daniel F. (00:19:32) - Yeah, we are. Some of that has just happened organically over the years, and we've never discouraged it. Sometimes people ask, "Am I allowed?" and I say, yes, please, do more. We've been loose about it: if you're showing our interface, it's not like we're going to come after you, and people sometimes have that fear. More recently we've been proactively reaching out to people who already have a following or some influence in a space and saying: you're already talking about our product, how could we make this more efficient for you and support you in doing it? I think there's a lot of credibility there. I'll also share one of the interesting findings; I'm a former marketer, that's where I started.

Daniel F. (00:20:17) - People tend to trust content that comes from the user assistance side, or customer education, which is the buzzword people use now, even more than they trust the marketing content. And you've done this, right? You're investing in a product and you think, okay, I've watched the marketing video, it seems like it'll do what I need it to do; now I'm going to go watch a couple of tutorials and see whether I'm confident it actually will, and whether I can make it do that with my skill set. So I think there's a great opportunity across the whole customer life cycle for this kind of content to be injected and be useful to people.

Michael G. (00:20:59) - So Daniel, have you actually measured the impact of people interacting with your content, and how that affects their conversion rates?

Daniel F. (00:21:08) - Yeah, so back to the experimentation topic. One of the sets of experiments we ran... well, let me show you first what a banner looks like inside of Snagit, because it's a little vague if you haven't seen it. This is the interface of the product, with a toolbar across the top, and this blue section is what we just call a banner. We coded the display of it, but we're using Revenera to power it. If I want to make a new one of these, I go into Revenera ReachOut, create it as a campaign, and put in some XML. That XML gets delivered to the product, and the product knows how to render it as text and links.
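
The talk doesn't show the actual XML schema TechSmith uses for these banners, but conceptually the client receives a small payload and renders it as text plus links, along the lines of this illustrative sketch (the element names and URL are placeholders).

```python
# Hypothetical banner payload and parser -- schema invented for illustration.
import xml.etree.ElementTree as ET

payload = """
<banner id="jtbd-docs-day1">
  <text>Writing technical guides? Start with a scrolling capture.</text>
  <link label="Watch the 2-minute tutorial"
        href="https://example.com/snagit/scrolling-capture-tutorial"/>
</banner>
"""

root = ET.fromstring(payload)
banner_text = root.findtext("text")
links = [(a.get("label"), a.get("href")) for a in root.findall("link")]

print(banner_text)
for label, href in links:
    print(f"  [{label}] -> {href}")
```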

Daniel F. (00:21:55) - So we said, okay, this is a great surface for experimentation. One of the things we wanted to do was set up a cadence for these different jobs to be done, so we decided to run it as a split test. I worked with some of our user assistance folks and we said: let's have a control that's pretty generic, a variant track where day one, day two, day three, each day you get another message, and then another variant, and compare the three against each other and look for lift, specifically lift in moving people from trial to paid. That's exciting, because it's actually really hard to find ways to test the impact of things like this; a lot of the people doing this kind of work have to fly blind much of the time. We ran a time-boxed test: the control performed at its level, our first variant had a really nice lift, and the second variant had less, so we put the first variant into production. That's what now gets served up to every new trial user who tells us their job to be done, and we can let it run and optimize it over time.
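
A back-of-the-envelope sketch of that lift comparison: trial-to-paid conversion for the control banner versus two variants, with lift reported relative to control. The figures are invented; no actual conversion numbers were disclosed in the talk. In practice you would also want to check statistical significance before promoting a variant to production.

```python
# Fabricated split-test results: arm name -> (trial users exposed, conversions to paid).
arms = {
    "control":   (4000, 200),
    "variant_a": (4000, 232),   # day-by-day message cadence
    "variant_b": (4000, 210),
}


def conversion(trials, conversions):
    return conversions / trials


control_rate = conversion(*arms["control"])
for name, (n, c) in arms.items():
    rate = conversion(n, c)
    lift = (rate - control_rate) / control_rate
    print(f"{name}: {rate:.2%} conversion, {lift:+.1%} lift vs control")
```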

Vic D. (00:23:15) - And I know the key metric is conversion, because that's revenue and that's very important. But say I had a new release coming out, I'd put a lot of effort into new functionality, and I'm trying to discern whether it's helping the overall experience. I think it goes back to your first slide about time to value: you could also measure how much more adoption is now occurring in the product. That would be a percentage lift as well, if it's not just the trial I'm concerned about.

Daniel F. (00:23:43) - Yes, and honestly you can see the impacts everywhere. For example, this is just some of the many pieces of content we have to support the product, and we found that you could see lift in usage of that content too. So you can look for impact in many areas; people are interacting with the content more. We did some very large-scale analysis and found that the more people interacted with this type of helpful content, the more successful they were with the product. That's also good: even if you can't draw a direct line for every link in the chain, you can still have a big-picture view.

Vic D. (00:24:22) - Going back to the image of your banner: I certainly like that you preserved space. You didn't just throw up a message to the user and take over the screen. Do you keep that banner pretty consistent, so there's a top-level message and if you want more you click, but you're not forcing something in front of the user at that point?

Daniel F. (00:24:42) - Yeah. A couple of principles here, and we've evolved over time. It was with much trepidation that we ever did the first one of these, I'm telling you, because you don't want to annoy anybody or make anybody upset. So we've proceeded really cautiously and listened for any negative feedback. One of the things we do have is a control setting in the product, so you can say: I only want helpful tips, never a promotional tip. We have categories; sometimes it's "hey, there's a new version out" or "here's a webinar about a new feature." You can say no, I don't ever want those, and you can also shut them all off and never see any of this.

Daniel F. (00:25:21) - We do keep them to this pretty modest space. There is one other type we use, which is more of a pop-up, and we use that pretty sparingly. Right now it runs on first launch, and it's really a bigger introduction to what's new. You actually see a lot of desktop products do this now, the whole Office suite does, where they'll pop up something like "hey, new in this version." I think people are getting pretty accustomed to that, but we make it very video driven and very much about the use case and the value in that version. We're doing that for both Snagit and Camtasia now, and that's also powered by Revenera.

Vic D. (00:26:02) - Yeah. You mentioned the webinar use case; have you measured the increase in webinar traffic based on in-product messaging, and what that has done for it?

Daniel F. (00:26:13) - Yeah, we've done some comparison of email versus in-product delivery. I used to write a lot of emails for TechSmith, and I think email is a great medium; its death has been predicted a dozen times and it just isn't going to die, so it's useful. Having more ways to reach people is always good. Email can pull somebody back in who has become disengaged with your product in a way that in-product messaging can't, but on the other hand, in-product messaging can be super timely, right there while you're in the product, and feel very relevant. So we try to use each for its strengths. The amount of attention and interaction you get with some of the in-product messaging is fantastic; it's a great tool for getting a message out to people. And again, it's all about helping your experience be better and helping you be more successful at what you're doing. It's really meant to be a win-win, and for anyone for whom it's not, there's a way to shut it off, and that's okay too.

Michael G. (00:27:19) - So Daniel, I think we have time for one more topic area before we have to wrap up. Obviously you have a lot of learnings; it sounds like you have a very mature usage analytics program in place. Can you talk a little bit about what it looked like back when you were first starting out with this and with experimentation, some of the learnings along the way, and maybe some of the results?

Daniel F. (00:27:41) - Yeah. Even backing up before we were using ReachOut and messaging, which I think we've covered a lot, some of the initial things when we first got access to the usage data were answering basic questions. Like I mentioned, Snagit is a robust image editor that also makes simple videos, and we often asked ourselves: is investing in the video component worth it? Are people using it, and how much? We had questions about what they were using it for. Obviously it's anonymous usage analytics, and we never collect anything like the URL of what somebody's capturing; we don't want that. But there are deductions we can make: if someone's making a lot of one-minute videos, that's different from making 15- or 30-minute videos.

Daniel F. (00:28:28) - When we dug into that early on, we found, surprisingly, that there are a lot of one-minute videos and a lot of 60-minute videos. What are those? Oh, webinars are 60 minutes; a lot of people are using this to capture a meeting or some streamed content they want to archive for later. Those kinds of insights were really helpful very early on in understanding the shape and pattern of usage. Also, sometimes you need to retire a feature, or you have to decide whether to recreate a feature in the new technology or let it go, and knowing how many people use it can inform those decisions. We've also done surveying to say: if it's a small number of people using it, let's ask them why and how they're using it, so we at least understand the use cases and don't make decisions blindly.
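
A small sketch of the kind of duration analysis Daniel recalls: bucket anonymous video lengths to see the shape of usage, with one spike at short clips and another near the one-hour mark. The durations below are fabricated sample data.

```python
# Bucket anonymous video durations into a simple histogram (made-up data).
from collections import Counter

durations_minutes = [1, 1, 2, 1, 3, 58, 61, 1, 2, 60, 1, 45, 1, 2, 59]


def bucket(minutes):
    if minutes <= 2:
        return "<= 2 min (quick clips)"
    if minutes <= 15:
        return "3-15 min"
    if minutes <= 45:
        return "16-45 min"
    return "> 45 min (captured webinars / meetings)"


histogram = Counter(bucket(m) for m in durations_minutes)
for label, count in sorted(histogram.items()):
    print(f"{label}: {count}")
```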

Michael G. (00:29:19) - Yeah, I think the ability to figure out who you're going to ask those survey questions to matters; you definitely want them to be qualified, whether that's through reaching a certain level of usage or being focused on a specific area of the product. That's really interesting. So I just want to thank you, Daniel, because this was awesome. I hope everybody in the audience enjoyed it as much as Vic and I did; we really appreciate you taking the time to share with us. Hopefully we'll be doing more of these with other customers, getting into different use cases for usage analytics, because there's a lot to cover there. In case you missed it, just this morning we released our Monetization Monitor report on software usage analytics 2020. Go to our website or social channels and you'll see links to it; I highly encourage you to download it and see how your use of analytics compares with what our survey showed. So with that, thank you again, Daniel, thank you, Vic. I think this was a really good session, and I'll hand it back to Louise.

Louise S. (00:30:22) - Thank you, Michael. Yes, thanks, Daniel; thank you, Vic; and thank you to everybody who attended today. Just a quick reminder: we will be sending out a copy of this recording within 48 hours, and if you have any questions at all, please contact us using the email addresses you can see on screen now. Thank you very much, everybody.

Michael G. (00:30:37) - Bye-bye. Thanks all.