Moderator: Could we start with a brief background of your career and your marketing journey?
Paresh: I've been doing marketing for about 12 years now, 10 of them in the SaaS space, primarily building and scaling marketing teams for small to medium-sized SaaS companies. Most of the organizations I've joined have been in the range of $10 to $25 Mn in revenue, and I think I've found my strength in joining organizations at that phase of their journey and scaling them 2x to 5x from there. I started my career with Advanced American in LA, first in an offline marketing role and then a digital one, and that's where I found my interest in digital. After that I moved to Cvent, where I spent four years in a marketing operations role. By the time I left, I was leading their email marketing, marketing automation, database, and events execution teams. After that I joined RateGain, which was a sales-driven organization. I joined when they were setting up marketing, and over three years I built a team of 14 people there. During the same period, RateGain's revenue increased 3-4x. Beyond that, I spent a brief two years across two companies, BirdEye and Altudo. I've now been at Wingify for almost two years, heading marketing. It's very different from every other organization I've worked at, because Wingify is primarily funded by its customers, and the challenges that come with being a bootstrapped organization are very different from those of a funded or VC-backed company. But the concepts of marketing don't change. That's a high-level view of my journey so far.
Moderator: Can you give us an overview of how the marketing organization is set up at Wingify and how it differs from your previous SaaS employers? What do these different teams do, and what are their north star metrics?
Paresh: I would say the marketing team at Wingify keeps evolving a lot. One thing that is different about Wingify's marketing team compared to any other organization I've worked for is our propensity to constantly change to whatever needs to be done. In the last two years, we've constantly tried to pivot and change how we approach marketing based on what is and isn't working, using data a lot more. What I'm trying to say is that the current marketing structure may not be a true representation of the entire last two years, as it has constantly changed and evolved. We've currently structured it into three different areas. One team focuses on capturing everybody who is in the market today, who at any given point in time is actively looking at, considering, or evaluating a product like ours. Others call this demand gen. Now, capturing those people has two aspects. One is ensuring discoverability, meaning they are able to discover VWO or Wingify right at the point where they really need us. The second is convincing them that VWO is a good fit, that they should definitely try us or work with us and see if we can solve their use cases. The largest sub-team is called Harpoon, which you can think of as the team responsible for harpooning users for our products. This entire team focuses on the keywords which have converted well for us in the past, which we know have a higher probability of being searched by people who are evaluating something, and on everything else we can do to find these people and convert them. A subset of this is paid, where we focus on paid acquisition across Google, LinkedIn, and other channels. The next part of Harpoon is content, and the way we think about harpooning content is that it helps with both discoverability and convincing. Anything we write is very focused on conversion, but it has to be good enough that people are able to discover us.
Then there is another set of people who are not in the market today but are in our ideal customer profile. They're not ready to buy our tool or engage with us today. What matters there is brand recall: how do we constantly stay in front of them, so that whenever a need arises in the future, whenever they graduate from not being in the market to being in the market, we are able to convert them better or cast a good net. The brand team supports all of this, because with constant brand recall, the path to converting them becomes a lot easier. Within this second pillar, a lot of things happen, from educational content to brand campaigns, category creation, and community building; all of that sits in this second bucket.
The third is purely marketing operations, where three things happen. One is ensuring that all processes and data flows work correctly and that all systems are being used to their maximum potential. The second part is experimentation. Because we are an experimentation company, we like to run experiments; there are people who focus solely on constantly experimenting to improve efficiency across different projects. The third, which is not directly part of marketing but which we have come to depend on more and more in the last six months, is analytics: somebody constantly giving us insights, keeping us updated about what's happening, and telling us how we need to change or course-correct based on data. This is how the team is structured at a very high level.
Moderator: How big is each of these sub-teams?
Paresh: The Harpoon team has nine people. The brand team is smaller right now, but we have a goal to get it to five people, and everything else is another seven people.
Moderator: What are the north star metrics for each of these sub-teams?
Paresh: For Harpoon, it is pretty simple. Ultimately it's revenue that we want to chase. But what we realized was that there could be earlier milestones, be it MQLs or conversions from MQLs to opportunities. We track MQLs, opportunities, pipeline, and all that good stuff. But ultimately, for the business, revenue is the most important thing. And if you think about marketing, it's very difficult for somebody sitting in the Harpoon team to influence what happens after a lead is generated. Once it's handed over to sales, we can track what's happening with it and follow up, but when you're getting 3k-4k leads a month that becomes very difficult. Also, somebody in marketing doesn't have control over what happens on the sales side. Hence we realized it was very difficult to give revenue as a north star metric to the Harpoon team. So what we did was take all the historic data and work out the correlation of 20 different lead-level attributes, like industry, geography, technologies the prospect uses, Alexa rank, etc., with the actual opportunities that were created. Based on that, we built a model that assigns a predicted opportunity score to every lead that comes in. Now, the average predicted opportunity score for the month is the north star metric for the team. And there are two ways to move that metric. One: if I focus on increasing the number of leads and the quality of the leads doesn't change over time, that's fine, because you're meeting your quality threshold while increasing the number of leads and opps. The other way: if my number of leads is decreasing but the quality, the opp score, is increasing, that's still okay, because the number of opportunities will still be higher. Ultimately, what we care about is the number of opportunities.
Now, as a Harpoon person, I can decide whether I want to increase lead volume while keeping quality at a bare minimum, or keep lead volume the same and increase quality, e.g., more enterprise leads or more leads from the US. It also helps us diagnose problems: if our average opp score is increasing month on month but actual opps are decreasing, then most probably the digging needs to start on the sales side, and we need to go and fix that. Conversely, if the average opp prediction score is decreasing, then something is wrong on the marketing side, which we need to go and fix. So the month's average opp score gives us a clear handoff point between sales and marketing. As long as the average opp score is increasing month over month, the Harpoon team has done its job.
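The mechanics described above can be sketched roughly as follows. This is a hypothetical illustration, not Wingify's actual model: the attribute weights, baseline, and sample leads are made up, and a real version would be fitted on historical lead-to-opportunity data across the 20 attributes mentioned.

```python
# Illustrative sketch of a predicted-opportunity ("opp") score.
# Weights and baseline are hypothetical stand-ins for coefficients
# derived from historical lead-to-opportunity correlations.
WEIGHTS = {
    ("geography", "US"): 0.04,
    ("industry", "ecommerce"): 0.03,
    ("company_size", "enterprise"): 0.05,
}
BASELINE = 0.02  # assumed base chance any lead becomes an opportunity

def opp_score(lead: dict) -> float:
    """Predicted probability that this lead turns into an opportunity."""
    score = BASELINE
    for attr, value in lead.items():
        score += WEIGHTS.get((attr, value), 0.0)
    return score

def predicted_opportunities(leads: list) -> float:
    """Lead volume x average opp score = expected opportunities for the month."""
    avg = sum(opp_score(lead) for lead in leads) / len(leads)
    return len(leads) * avg

# Hypothetical month of two leads: raising either volume (at constant
# quality) or quality (at constant volume) raises the output.
leads = [
    {"geography": "US", "company_size": "enterprise"},
    {"geography": "IN", "industry": "ecommerce"},
]
print(predicted_opportunities(leads))
```

The point of the construction is the diagnostic split Paresh describes: marketing owns the average score, sales owns what happens after, and the product of volume and score predicts the month's opportunities.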
Moderator: So do you also use lead scoring models separately?
Paresh: We also have a lead score, which is primarily based on demographic and behavioral attributes and is used for prioritizing sales calls. So the lead scoring model looks at, say, somebody who has visited your page five times versus somebody who has visited just once, and helps us decide who we should call first; I would rather call the person who visited five times. The opp score model, on the other hand, is more aligned with the business metrics of opportunities and revenue.
Moderator: Paresh, you talked about the north star metric for the harpoon team but what about the brand team?
Paresh: For the brand team, for the first year and a half, and for the longest time in my career, we always looked at just one angle: an increase in leads and opportunities. But we realized that if we have to scale the brand and justify investments in branding, we'll probably never scale it to the level we want if we keep looking at those as the main metrics. Because, whether we like it or not, running brand campaigns is not going to give us an uplift in leads or opps, or whatever correlates with business, in the short run. In the long run, yes, there is no doubt about that: you need great brand perception to go after your ICP, especially if your ICP is the enterprise. So we decided that the north star metric for the brand pod is consumption of the VWO logo, or of content we have tagged as brand rather than Harpoon. We did an exercise to tag everything we produce as one or the other. Something like the Masters of Conversion series we run is just hosting industry experts for training sessions that have nothing to do with VWO. We don't even mention or sell VWO; it is purely an industry initiative to help people get excited about and learn about experimentation. That's brand content, and we want it to go as wide as possible. We realized we're not going to get MQLs out of it; we don't even ask for an email address to view the recording. So what happens in that case? How do you justify doing more and more of that content? It comes down to brand consumption. So we've built a model where we've identified all the channels we focus on from a brand perspective, and we flow in all the data. I'll give an example. We recently did a podcast sponsorship.
A month after that podcast sponsorship, we got data on how many people downloaded it and the average listen time, and we simply added that in as X hours of content consumed. Similarly, for our Masters of Conversion series, we know how many views the recordings get and the average watch time, which gives us the hours for those videos. We have everything tagged as brand or not, and that's how we calculate the hours spent consuming the VWO logo or brand content. The goal is to 10x that. If you 10x the hours consumed, then by default brand perception, brand visibility, and awareness will increase.
So the north star metric there is the number of hours of the VWO logo or brand content consumed by the ICP. When we sponsor digital events, we just have a logo in a 30-minute presentation; if 500 people were present looking at our logo for 30 minutes, that converts into a number of hours, and yes, that's the north star metric.
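A back-of-the-envelope version of that hours calculation might look like the following. Only the 500-attendee, 30-minute event example comes from above; the podcast numbers are made up for illustration, and the real model presumably aggregates many more channels.

```python
# Toy calculation of "hours of brand content consumed", summing
# exposure across channels. Numbers are illustrative, not real data.

def event_hours(attendees: int, minutes_on_screen: float) -> float:
    """Sponsored event: attendees x minutes the logo was visible."""
    return attendees * minutes_on_screen / 60

def media_hours(plays: int, avg_minutes: float) -> float:
    """Podcast or video: downloads/views x average listen/watch time."""
    return plays * avg_minutes / 60

# 500 people watching a 30-minute logo-bearing presentation = 250 hours;
# a hypothetical podcast with 1,200 downloads at 18 minutes each adds 360.
total = event_hours(500, 30) + media_hours(1200, 18)
print(total)
```

Whatever the exact channel mix, the metric reduces every brand touchpoint to a common unit (hours consumed by the ICP), which is what makes "10x it" a coherent goal.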
Moderator: How do you measure the efficacy of marketing at Wingify? Can you get into specifics of metrics because there will be different metrics discussed in a boardroom meeting versus when you're having a marketing team's weekly or monthly meetings?
Paresh: What I've realized is that those two things can't be different. If you're discussing some metrics in the boardroom and other metrics with your team, there will be a gap. That used to happen a lot; at least at Wingify, it doesn't anymore. But I'm sure in other businesses it still happens that there is just no connection between what your teams are chasing and what the company expects from marketing. And even where those two are aligned, it's still a large challenge to agree on how that gets measured. Ultimately, in a boardroom, everything comes down to revenue. Maybe you did better compared to last quarter, but unless the organization meets its revenue goals, you're never going to get a pat on the back. So at Wingify, from a measurement perspective, we've aligned our top management and board to look at the same thing: the average opp score, month over month. We've also figured out a way to predict the number of opps that will be created in a month based on the number of leads and the opp score: multiplying leads by the average opp score gives us the predicted number of opportunities. So as long as the number of predicted opportunities being created is increasing, marketing is meeting its goals. E.g., in the goal sheet of a content writer on the Harpoon team, the goal is: with whatever content you produce, how many predicted opportunities are you going to deliver this month? I think it was super essential that we align everybody on the team to that one goal, because the moment you start deviating from it, it just doesn't work.
We do assign a quality score to each content piece, and we do track traffic and other useful metrics, but ultimately it is the lagging metric that matters: how you'll be scored at the end of the year is based on how many predicted opportunities were created month on month. That metric doesn't change from person to person within the team; what changes are the projects and tasks they take on to achieve it. At least at Wingify, that's how we have structured it.
But I hear what you're saying. For example, at RateGain, the interesting thing was that RateGain had never had a marketing team before. So what we cared about in the first year was brand awareness; we did not even care about leads. I do believe that different companies at different stages will have different boardroom metrics, and that over a period of time they converge. But for us, for the first two years, all we cared about was brand visibility and brand awareness. Because it was a niche, the hospitality industry, and we knew there are X number of hotels in the world, the way to measure success was knowing what percentage of hotels know RateGain. And a simple way to measure that was, at the start of the year, to survey all the salespeople and ask: "In what percentage of the calls or walk-in meetings you do are they already aware of RateGain, so that you don't have to explain what we do?" We got a number at the start of the year, and at the end of the year, after doing everything we had to do, we ran the survey again and looked at the delta; that delta was what we were chasing. At the same time, there were different teams, paid and SEO, and of course they were chasing their own metrics, the standard stuff, but the boardroom metric was brand awareness. So what I'm trying to get to are two things. One is the challenge I've seen and faced throughout my career: a lot of marketers, including myself, fail to set up that one core metric, the boardroom metric, that we all chase. Secondly, it is absolutely essential that the board, the management team, and even the marketing team sign off on it. And once it is achieved, we shouldn't shy away from saying that we've achieved whatever we set out to do in marketing.
Moderator: Paresh, you mentioned the quality score for content pieces? What's this quality score and then how do you calculate it?
Paresh: The quality score is built on a framework in which we decided that content has to be compelling, logical, authoritative, of original value, and persuasive. We have an editor who is neutral, and her job is to review all content and assign a quality score; that way there is also no conflict of interest. Based on the framework, this editor scores the content on a few parameters which help us understand whether it is compelling, logical, and so on, and a content piece gets published only if it hits a certain minimum score. Once the quality score was in place, though the data is still thin, we also saw a decent correlation so far between the quality score and page views and MQLs.
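As a rough sketch of such a pre-publish gate: the five parameter names follow the framework above, but the rating scale, the averaging, and the minimum threshold are assumptions, since the interview doesn't specify how the editor's ratings are combined.

```python
# Hypothetical pre-publish quality gate. Parameter names come from the
# framework described above; the 1-5 scale and threshold are assumed.
PARAMETERS = ["compelling", "logical", "authoritative", "original", "persuasive"]
MIN_SCORE = 3.5  # assumed minimum on an assumed 1-5 scale

def quality_score(ratings: dict) -> float:
    """Average of the neutral editor's per-parameter ratings."""
    return sum(ratings[p] for p in PARAMETERS) / len(PARAMETERS)

def can_publish(ratings: dict) -> bool:
    """A piece ships only if it clears the minimum quality score."""
    return quality_score(ratings) >= MIN_SCORE

# Example draft scored by the editor (illustrative ratings).
draft = {"compelling": 4, "logical": 4, "authoritative": 3,
         "original": 4, "persuasive": 3}
print(quality_score(draft), can_publish(draft))
```

Because the gate runs before publication, it acts as a leading quality control, while the correlation with page views and MQLs mentioned above is the after-the-fact validation of the framework.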
Audience Question: How does the opp score get tied to the quality score?
Paresh: These are two separate metrics. The quality score is pre-publication and ensures that anything we produce meets a certain standard. We correlated MQLs and page views with the content's quality score to validate the framework. Eventually, we should think about how to connect the two: once we have enough data, we could take it a step further and look at the relationship between an article's quality score and the opp prediction score it generates. If we find a correlation there, even better. Ultimately, a high quality score should result in a high opp prediction score.
Moderator: After the content is released, do you monitor the engagement from the user's side, in terms of time spent and scroll, or is it like a good or bad experience?
Paresh: Yeah, not as much as we would love to. We do still look at all the vanity metrics; there are weekly scheduled reports covering search trends, page visits, bounce rates, etc. But I don't think we get enough time to look at them, make sense of them, and constantly take action based on them. That's one big challenge which I believe not only we but most marketing teams face: there's so much data, and the decision becomes how much time you spend in which area. Unfortunately, we haven't been able to spend enough time decoding all of it. If there is a major change, that's when we deep-dive into it; otherwise, we don't proactively look at it.
Moderator: What would you consider to be the biggest challenge within marketing in SaaS companies of like $10 to $20 Mn space?
Paresh: I've seen a lot of SaaS companies, especially marketing teams, give up too early on initiatives. When I was reflecting on my top learnings from 2020, the top one I wrote down was: not completing the feedback circle. Typically what happens is that we run a lot of campaigns and projects and execute them; we feel happy; we move on to the next one. But we never sit down and decode what impact they had on business metrics and complete that circle. The other version is that we start projects or campaigns, time is limited, we are all busy, something better and shinier comes along, and we just abandon them.
The second big thing, I would say not a challenge but a trend I'm seeing across the board, is using data a lot more in decision-making. In the last six months we have gotten better and are doing our bit. But the challenge I find is that not everybody on the team has that mindset. Not everybody is constantly thinking from a data perspective, and it is really hard to get all 20 people on the team to constantly make decisions or think from a data perspective. That's one big challenge most marketers are facing. Everybody understands that to be successful over the next couple of years, marketing is going to have to be a lot more data-driven. The expectations are skyrocketing, and marketing is expected to contribute directly to revenue. That was not the case back in 2010: then it was leads; over the years it became qualified leads, then opportunities, then pipeline, and now even revenue. To make that happen, we need data. Without it, it is very difficult, because marketing isn't talking to customers directly; what we have are insights, past performance data, and customer-centricity. Now the challenge is: how will you make everybody on the team use this data? That's a difficult, unsolved problem, and I don't even know if it will be solved in the long run.
Moderator: Describe your dream marketing analytics tool. Get as creative as possible: a tool that would solve all your marketing problems.
Paresh: I would really love a bot that constantly, every day, every hour, sends me the insights I should look at, based on everything it sees and its intelligence.
I think I have three challenges there. One, I personally, and I believe most leaders, don't have enough time to spend looking at data, understanding analytics, figuring out insights, and then deciding how to use them in what we are doing. The second is that I don't have a dedicated team to do this, and I don't think a lot of companies in our range of $15 to $40 Mn have a dedicated three-to-four-person team that can constantly do it. The third is the problem I spoke about: how do you get everybody on the team into the mindset of mining insights from data? And the data keeps increasing every day. So I don't know, maybe a bot or an AI that is constantly looking at these zillions of data points, trained well enough to come up with one or two key insights a day and send them to me, saying, "Today, think about these two things and work on these two things. It will help you improve your business every day." If I am pushed to think about two key findings per day, then over the course of a year I will be thinking about and acting on at least 400 or 500 of these insights, versus maybe 50 today. Maybe this is wishful thinking, but I'm just thinking from a first-principles, actionable perspective.
Another problem is the customization of dashboards and visualizations to my needs. You can get it done with Tableau or Looker, with the help of a couple of experts, after spending some time. But in my experience with Omniture and Bizible, primarily Omniture, I found it very difficult to customize the whole thing. It took us a year and a half to implement Omniture at Cvent and get some value out of it, and I don't even know if we saw that value at the end of the day, because the momentum was lost. So I would love a super-customizable dashboard that doesn't rely on others in the team. Today I have to rely on the BigQuery or analytics team, and there is no way I can quickly build what I want beyond the typical Salesforce or Marketo reports.
Moderator: I think in that case you're going to love the next version of Factors! If you were to refer one marketing leader for our sessions, who would it be?
Audience Question: Paresh, you had mentioned there is so much data. Do you see the bigger challenge being that people really want to figure it out but just aren't able to work with what they have, or that they don't even want to figure it out and are being forced to?
Paresh: It's the former. In fact, in the last six months, since we made a lot of these dashboards and data available, things have changed. Initially only the leaders were involved, but now I see requests for a new filter or view daily, and a lot of people have started building their own dashboards. So I think it's a mindset thing: the majority of people want to do it; they just find it too overwhelming or difficult. The moment the data was made consumable via clear dashboards with filters where they can slice it, 70% of the marketing team started using it weekly if not daily. That is a stark difference from when we didn't have anything. I do feel people want to use the data; it's the availability, the ease of use, and their comfort level that need to change.
Audience Question: Paresh, you described the challenge of getting the team to look at data. But if not data, how do they usually or used to make decisions or are naturally interested in? Are there set models, intuition, or emotional factors?
Paresh: People just get so focused on delivering projects. The shift needs to happen right from the goal sheets. I think one reason everybody's looking at data now is that on the goal sheet we have connected their rating and goals to the north star metric. In the absence of that, e.g., if a content marketer is simply asked to publish quality content, he or she will get into the loop of just delivering projects. So more than interest, I think it's about the environment and how you really push them towards data.
Audience Question: What was one concept or metric you found challenging to quantify, maybe due to uncertainty, lack of data, or difficulty coming up with the right formula?
Paresh: The biggest problem all marketing leaders have is the marketing-sales handoff and alignment. E.g., the sales team thinks marketing isn't bringing in good-quality leads, and that's why the number of opportunities or deals is decreasing; and vice versa, marketing thinks the quality of the leads hasn't changed and sales is doing something wrong, hence the fewer opportunities or deals. Like I mentioned earlier, we solved that by introducing the opp score. If the opp score is increasing, the number of opportunities should ideally increase, and if it doesn't, we know we need to fix something on the sales side. If the opp score is decreasing, we know we need to fix something on marketing's end.
Audience Question: How do you promote and celebrate being data-driven in your marketing team?
Paresh: Apart from posting in Slack channels or over email, in the last six months we have started pushing people to share their learnings, and we have made managers responsible for that as well. Even when I figure something out, I build a doc and share the learning with the team. So whenever anyone does something new and learns from it, we have made it a habit to share it with the group, and kudos follow.
Moderator: Thanks a lot, Paresh. We got more than we bargained for in terms of learning in 90 minutes.
Paresh: I'm happy I could play a role in what you guys are doing. Bye!