Prove It Works: Ryan Jones on Why SEO Testing Changes Everything

31 min
Guest: Ryan Jones
Episode 125
Are you just repeating SEO advice without testing it? In our latest episode, Ryan Jones shares why a testing mindset is crucial for SEO success and how to implement it effectively.
Connect with Michael:
on Twitter @servicescaling
on Instagram @cos71n
on LinkedIn
his personal website.

Connect with Arthur:
his personal website
on LinkedIn

Watch our YouTube:
We're posting @watchtheseoshow

Our SEO agency:
Check out our agency Local Digital
Follow our agency Local Digital on Instagram @localdigitalco
Check out our content on YouTube

Show Notes

In this episode of The SEO Show, I, Michael Costin, am joined by Ryan Jones from SEOtesting.com to delve into the often-overlooked topic of SEO testing. We kick off our conversation by discussing the importance of testing in the SEO realm, drawing parallels to the more commonly accepted practice of A/B testing in paid media. Ryan emphasizes that many SEO professionals tend to repeat advice they've heard without conducting their own experiments, which can lead to a lack of understanding of what truly works for their specific situations.

Ryan shares his background in SEO, which began in 2015, and highlights his nearly ten years of experience in the field. He explains that SEO testing has evolved significantly, especially with the advent of more sophisticated tools and data analytics. We explore the two main types of SEO testing: time-based testing and split testing, and Ryan provides insights into how these methods can yield valuable data to inform SEO strategies.

Throughout our discussion, Ryan shares practical examples of successful tests conducted on the SEOtesting.com site, such as content refreshes and internal link updates. He reveals that many of the team's most effective tests have involved simplifying content rather than adding more, which challenges conventional wisdom in the industry. We also touch on the importance of measuring success not just through traffic metrics but by focusing on business outcomes, such as lead generation and conversion rates.

As we navigate the complexities of SEO testing, Ryan stresses the need for a testing mindset—encouraging listeners to question assumptions, treat decisions as hypotheses, and be comfortable with the possibility of failure. He warns against common pitfalls, such as not allowing tests to run long enough to gather meaningful data and misinterpreting metrics like average position.

Towards the end of the episode, we discuss the significance of having a solid foundation of analytics in place, particularly with the transition to GA4, and how SEOtesting.com can help streamline the testing process by integrating data from Google Search Console and GA4.

This episode is packed with actionable insights and encourages SEO professionals to adopt a more experimental approach to their strategies. If you're looking to enhance your understanding of SEO testing and its practical applications, this conversation with Ryan Jones is a must-listen!

00:00:00 - Introduction to the SEO Show
00:00:24 - Guest Introduction: Ryan Jones
00:01:27 - The Importance of SEO Testing
00:02:08 - Ryan's Background in SEO
00:03:04 - Defining SEO Testing
00:04:08 - Common SEO Tests and Their Success
00:06:26 - Measuring SEO Test Results
00:08:01 - Testing Across Different Websites
00:09:26 - Traffic Levels and Testing Suitability
00:12:01 - Real-World Examples of Successful Tests
00:14:56 - Ineffective SEO Practices
00:16:18 - Developing a Testing Mindset
00:20:28 - Common Pitfalls in SEO Testing
00:22:08 - Interpreting SEO Data Effectively
00:24:36 - The Role of GA4 in SEO Testing
00:27:12 - Overview of SEOtesting.com Tool
00:30:02 - Connecting with Ryan Jones
00:30:31 - Closing Remarks

Transcript

INTRO:
It's time for the SEO show, where a couple of nerds talk search engine optimization so you can learn to compete in Google and grow your business online. Now here's your hosts, Michael and Arthur.

MICHAEL: Hello, and welcome to the SEO Show. Michael Costin here, and today I'm joined by Ryan Jones from SEOtesting.com to talk all about SEO testing. Welcome to the show, Ryan.

RYAN: Yeah, thank you very much. Thank you for having me. Excited to chat a bit and learn a bit more as well.

MICHAEL: Yeah, awesome. Well, we're covering the topic of testing in SEO today, which is something that really interests me, because in the SEO world people will often repeat things verbatim that they read or saw online but haven't actually tested themselves. And they'll speak with, you know, a lot of authority or confidence about something they haven't actually done. So I personally always like to try and test things where I can, just to see how the world works. But yeah, you're here today to talk about testing. So maybe you could give us a bit of a background about yourself, and then explain, you know, when we talk about SEO testing, what we're talking about in an overarching sense? And then we'll get into the nitty gritty.

RYAN: Yeah, absolutely. So a little bit about me. I started my SEO career back in 2015, straight out of secondary school in the UK. I was kind of one of the lucky ones, I suppose. I actually had no idea what I wanted to do when I left school, apart from the fact that I didn't want to go to uni and didn't necessarily want to do the college route or anything like that. I went into it on the basis that I liked computers, and this was a job that would allow me to work on computers for the majority of the time. And yeah, I ended up really loving it. So yeah, nearly 10 years in now; when August rolls around, it'll be a full 10 years. And the last nearly two years have been at SEOtesting.com, sharing my knowledge, appearing on podcasts like this and helping grow the brand that way. So when we talk about SEO testing, there are kind of two descriptions I hear. The first is the one we've always done: ever since SEO became a thing, we'd see something on a site that we weren't necessarily happy with, whether it was traffic, rankings, that kind of thing. We'd make a change based on what we'd seen online or what we'd seen certain creators come out with, and then go back into a rank tracker or Google Analytics and look at it that way. But now, with a lot more data and a lot more tools available, even just with GA4 and its event-based tracking, not only can we test a lot more, but we can get a lot more data back as well. So we can be more accurate when we look back and say, OK, that was a successful test, or that was something that hasn't quite worked, and we need to learn a bit more about why it didn't work.

MICHAEL: Awesome. So, you know, in the Google ads world, AB testing of a landing page is a pretty straightforward concept. You know, 50% of traffic sees one version, 50% another. You track which one performs better and there's hopefully a winner there. The SEO world, I guess, is a little different because there's so many different variables at play. Can you maybe walk us through an example of what some tests are that you're commonly running and where you see success with testing in the SEO world?

RYAN: Yeah, so a few of the ones that have worked for us, we're testing on our own site. The caveat here is that just because it's worked for us, it doesn't necessarily mean it's going to work for other sites. But it will always give people good ideas and good hypotheses to test. It sounds almost a bit stupid to say, but what's really worked well for us has been the SEO basics, in the sense that two of our most successful tests of all time, which we run frequently, are content refreshes and updating internal links to new or older content. I was looking at our dashboard the other day, and I think of the 21 content refresh tests that we've done since at least I've been at SEOtesting, 18 of those have been positive and have increased clicks by at least 20% each time they've been successful. So that's obviously a pretty good test in my book. An interesting one was, as part of the content refreshes, my thought process going in was, okay, this piece of content isn't performing well, so I need to add more content, cover more topics, that kind of thing, try and match search intent better. But I think some of the most successful ones we've had have been by actually removing a lot of content. And this came from research that came out about vector embeddings and that kind of thing, how Google understands content, and essentially removing a lot of filler words. And yeah, I think we managed to double traffic to one post by essentially removing half the content, which seems counterintuitive in my book, but obviously it works. So we keep doing it that way.

MICHAEL: Yeah, awesome. And how do you measure it? So let's say you're testing a content refresh. Are you focusing on just one variable, say an H1 tag for argument's sake, and just tweaking that on one page? And is there some sort of a control? How do you set these tests up to see what's going on?

RYAN: So we do time-based testing when we do content refreshes, purely because we're testing multiple things at the same time. When we do a content refresh, we usually update the title tag, we usually update the meta description, and we're obviously updating the content that's actually on the page as well. With time-based testing, which we run on our platform, it's one of those ones where we're not necessarily going to know the exact change that increased or decreased traffic, but it allows us to take seasonality out of it and test that way, because we've got historical data. If a certain blog post goes up in traffic in June, we know that we have to factor that in, or maybe hold the test off until traffic has returned to normal levels. But then for other test types, like changing structure and things like that, that's where we like to use split testing, which is exactly what you've described: using control variants and test variants, and that's a much more accurate way of testing that kind of thing. So yeah, there are multiple test types for the different types of tests that you want to run.
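
To make the time-based approach concrete, here's a minimal sketch of how such an evaluation could work: compare daily clicks (say, exported from Google Search Console) before and after the change and check whether the difference is statistically significant. This is an illustration only, not SEOtesting.com's actual methodology; the function and variable names are invented, and real tools also correct for seasonality, for example by comparing against the same period a year earlier.

```python
# Minimal sketch of a time-based SEO test evaluation. Assumes you have daily
# click counts exported from Google Search Console. Illustrative only.
from statistics import mean
from scipy import stats

def evaluate_time_based_test(pre_clicks, post_clicks, alpha=0.05):
    """Compare daily clicks before and after a change was made.

    pre_clicks / post_clicks: lists of daily click counts, e.g.
    28 days before the change and 28 days after.
    """
    # Welch's t-test: does the post-change mean differ significantly?
    t_stat, p_value = stats.ttest_ind(post_clicks, pre_clicks, equal_var=False)
    uplift = (mean(post_clicks) - mean(pre_clicks)) / mean(pre_clicks)
    return {
        "uplift_pct": round(uplift * 100, 1),
        "p_value": round(p_value, 4),
        "significant": p_value < alpha,
    }

# Example: a content refresh on one page (invented numbers)
before = [40, 38, 45, 41, 39, 44, 42] * 4   # 28 days pre-change
after  = [52, 49, 55, 50, 48, 57, 53] * 4   # 28 days post-change
print(evaluate_time_based_test(before, after))
```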

MICHAEL: Yeah. Awesome. And do you find this testing, does it sit in isolation? Like what works for one website might not necessarily always work for other websites. Like are you able to roll out the findings from a test to multiple sites or are they sort of working on a per site basis?

RYAN: It is one of those ones that works on a per-site basis, in the sense that, because we have access to client tests and occasionally look at those, we've found that stuff that works really well for us might not work for another site, and what works for them might not work for us. And it ties into something I imagine we'll talk about in a little bit, building a testing mindset, in the sense that we want to have a big backlog of tests that we've run in the past, so we can have a better idea of what's going to work and to what level. Obviously with the caveat that things in SEO change, so something that worked really well a year ago might not necessarily work as well today. It allows us to be a bit more educated when we form hypotheses, or when we want to change something, we can go back on that data and say, almost every time we've run this type of test in the past, it's been positive. We can look back on it that way and try and be a bit more educated about it.

MICHAEL: Yeah, great. So am I right in assuming that this sort of stuff, you need some sort of baseline level of traffic, like it's going to suit certain types of sites, maybe not others, you know, like a small local business. They might not be suitable for SEO testing because they don't have the traffic levels. Whereas maybe a big e-commerce site is more where you see this stuff working.

RYAN: Yeah, generally. I mean, it is always advisable to test on sites that have at least a decent level of underlying traffic, but that's especially true for split testing, when you're running with controls and test variables. And yeah, certain local businesses might not be a candidate for that type of testing. They might be more of a candidate for time-based testing, because you can work on existing traffic levels. As long as you've still got a decent amount of historical data there, you can test based on that. The only thing you have to keep in mind with that kind of stuff is that if you're working with lower traffic levels, you might have to keep the test running for longer to make sure you reach statistical significance, whereas an e-commerce site that's getting thousands of clicks a day might reach statistical significance quicker, so they could finish a test up in two weeks, whereas smaller local businesses might need six to eight weeks. That's something to keep in mind. And then, obviously, just with that, it's keeping an eye on Google updates and things. We know that Google updates their algorithms thousands of times a year and mostly doesn't tell us when things change, but keep an eye on when they actually do confirm stuff, like the March core update that's just finished rolling out. At the same time, you can keep an eye on the volatility trackers as well, so if things start to peak a little bit, you can look at it that way and say, maybe there's something algorithmic going on that we might want to keep an eye on. But the overarching point is that with time-based testing for smaller, more local sites, as long as you've got a good level of historical data to work on, even if the traffic's lower and it's been a bit stagnant for a time, as long as there's enough historical data that you can take seasonality out of it, then you're probably a good candidate for that type of testing.
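
Ryan's two-weeks-versus-six-to-eight-weeks point can be back-of-enveloped with a standard power calculation. The sketch below is illustrative only, not how any particular tool works: it assumes daily clicks are roughly normal and that you estimate their standard deviation from historical data.

```python
# Back-of-envelope estimate of how long a time-based test must run to detect
# a given uplift in mean daily clicks. Illustrative assumptions: daily clicks
# are roughly normal, and `daily_std` is estimated from historical data.
import math
from scipy.stats import norm

def days_needed(daily_clicks, rel_uplift, daily_std, alpha=0.05, power=0.8):
    """Days per period (before and after) to detect `rel_uplift` in daily clicks."""
    delta = daily_clicks * rel_uplift      # absolute uplift we want to detect
    z_alpha = norm.ppf(1 - alpha / 2)      # two-sided significance threshold
    z_beta = norm.ppf(power)               # desired statistical power
    return math.ceil(2 * ((z_alpha + z_beta) * daily_std / delta) ** 2)

# A large e-commerce site chasing a 5% lift on ~2,000 clicks/day:
print(days_needed(2000, 0.05, daily_std=90))   # ~13 days, roughly two weeks
# A small local site chasing a 20% lift on ~30 clicks/day:
print(days_needed(30, 0.20, daily_std=11))     # ~53 days, roughly seven to eight weeks
```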

MICHAEL: And do you have any examples that come to mind of tests that you've done or maybe that you've seen in client accounts that have really moved the needle? You know, their traffic was down here and now it's up there because of something that they've discovered in testing or something that they're able to unearth using these principles that you talk about, just so people get an understanding. I guess the theory of testing is all good, but what does it actually translate to for people in the real world in terms of results?

RYAN: Yeah, one of the things, so I suppose you could potentially class our site as one of those smaller sites. We don't necessarily get a whole lot of traffic, but we have enough to test on. When I first joined the business, the growth needle for us was publishing a lot of content in certain clusters, so we were focused on putting out a lot of content. And what we found was that kind of stagnated after a little while; the returns we were seeing slowed down. Now, we had an ongoing process of creating YouTube videos for most of the content that we produced on the website, almost as a content distribution model: we'd share it on social media, share it over YouTube. And we had the idea of, well, okay, why don't we embed these YouTube videos above the fold, right at the start of the content, have maybe a little introduction and then just embed the video there? Because people obviously learn in different ways; maybe someone doesn't want to sit down and read a whole 3,000-word blog post, but if they can watch a five or six minute YouTube video, they might be inclined to stay on the site a little bit longer, or finish watching that and then click through to some different pages. And there are all kinds of factors, honestly, that Google might be tracking, or maybe not, like whether they click something additional on the site, i.e. that video. And what we found was that for almost every video we embedded, traffic went up. And even when traffic didn't go up, i.e. more people weren't coming to the site, the people who were landing on the site were staying longer. They were clicking through to more pages. They might have actually been signing up to more trials as well. So that was overwhelmingly positive for us. So when it comes to smaller sites, it's one of those ones: once you have an idea, you can absolutely test it. We only did it on a small subsection of pages at first; we didn't want to go through the effort across the whole site in case it wasn't going to be worthwhile for us. So that was one of those ones where we had an idea, we tested it, it became overwhelmingly positive, and that was then a process we rolled out across almost every single blog post on the site.

MICHAEL: Awesome. What about conversely, have you found anything that is maybe promoted out there in the SEO world that when you test it doesn't have an impact?

RYAN: Yeah, an interesting one was, I know the standard advice that goes around, like at the start of a new year, update all your blog posts from 2024 to 2025. We did that just because it was obviously the standard thing. And yeah, for our site in particular, you could probably say the impact it had was negligible. In actual fact, we saw a slight drop in traffic for doing that. The reasoning for that, I don't necessarily know. I don't know whether it was because Google realized that we were changing literally just the title without following through with any sort of actual updates to the content, so they were noticing a change and then maybe going through and saying, well, okay, they've changed a year but they've not changed anything else, so let's just leave it as it is. But yeah, that's a decent example of the kind of advice you see all over LinkedIn and X, that we did, and it just didn't have any impact at all. So I suppose the better thing to have done would have been, okay, every blog post that says it's a guide for 2024, maybe we actually should have updated the content at least a little bit as well as updating it to 2025. That's always an interesting one: the things you see that say you should absolutely do this, when maybe you should only do it if you're actually updating the content as well.

MICHAEL: Yep. Yep. Anecdotally, I was on Twitter a while ago and I chimed in on some SEO chat where someone very confidently spoke about how guest posts don't work anymore. Like, if you buy a guest post, you won't rank. So I went and bought two domain names, very close to each other, using a gibberish phrase that, if you searched Google for it, didn't exist. I put the exact same content on each page, got both sites indexed, but then built a few guest posts to one of those pages, checked the rankings, and the one with the links ranked higher than the one without. So I was able to go back and say, well, look, I tested this and guest posts do work. Are you just saying this, or are you actually testing things? That mindset of testing is something that not enough SEOs really have. You know, they generally follow the advice that's out there, from the gurus in the space or YouTube or wherever they find it. But you mentioned earlier developing a testing mindset. What does that actually look like in practice? How can somebody go about not just doing SEO on their site, but thinking about this concept of testing?

RYAN: Yeah, so I think when it comes to developing a mindset about testing, the first thing to mention is exactly what you just said there: questioning the assumptions rather than just accepting the SEO rules that you see on LinkedIn. One of the things I like to say to people as well, if they ask me for advice, maybe they've seen me on a podcast or they come up to me after a talk I've done, is that you shouldn't necessarily do something I've said just because it has worked on our site. Don't listen to me. You can take ideas, that's absolutely fine, and then go and test them on your own site. But I think it comes down to, yeah, questioning things rather than just following the standard advice, and treating all decisions as hypotheses rather than big permanent decisions. Because at the end of the day, if you test something on a smaller subset of pages and it doesn't have the impact you hoped for, it's not the best idea to then say, well, we've committed to doing this now, let's roll it out anyway and see what happens. It comes through in the habits that we all have anyway of keeping an eye on the data. So if we notice that a test we're running is having a really negative impact, it makes sense to stop it completely and not necessarily run the test to completion. There are obviously some tests where we see a slight dip, maybe while Google re-indexes certain things, and then we see an increase from that, and those are the ones where we can sit there and let it happen. But if you change something and rankings and traffic tank, don't just sit there and say, oh, it'll come back, it'll come back. You can absolutely say, okay, that was a failed test, let's stop that. And following on from that, I think a big thing is being comfortable with being wrong. When you test as much as we do, or as much as people who have this sort of mindset do, not every test you run is going to be successful. And the important thing is to come back from that and say, okay, let's sit down and work out why it wasn't successful. Let's not just say it was a failed test and move on, but try and work out the reasons why, rework it, and then test again. So yeah, it all follows through from the sort of mindset that you have: don't treat things as permanent changes, treat them as hypotheses to test, and realize that nothing's permanent.

MICHAEL: On the topic of failed tests, do you see any, I guess, common pitfalls or mistakes that people are making when they get into this space?

RYAN: Yeah, I think one of the biggest mistakes I see is not letting tests run for a long enough period of time. So people might leave a test running for a week, see a little bit of a downturn, and say, okay, that's failed, let's turn that off. So unless, like I just said, traffic and rankings really tank, it's important to leave things there for a good amount of time until you've got that statistical significance. And then another mistake that we see, purely because of how our tool is built, is around the Google Search Console data we use to establish whether tests have been successful or not. We have had some people come to us on the site, or come to me directly, and say, oh, I started to run a test and then my average position really dropped. And I'll log into the dashboard, take a quick look, and I'll see that the average position dropped because they started ranking for a lot more queries. It makes sense for the average position to drop, because if previously they ranked at position five for two queries, and then all of a sudden they start ranking for a few hundred queries, that average position is going to drop. So don't look at certain things in a silo; look at all the data as a whole and then make a decision from that. I think those are the two most common ones.
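
Ryan's average-position point is worth seeing with numbers. Search Console's average position is impressions-weighted across every query a page appears for, so hundreds of newly ranking long-tail queries pull the average down even as overall visibility improves. The figures below are invented purely to illustrate the arithmetic.

```python
# Why "average position dropped" can be good news: the metric is an
# impressions-weighted average across every query a page appears for.
def avg_position(queries):
    """queries: list of (impressions, position) pairs for one page."""
    total = sum(imp for imp, _ in queries)
    return sum(imp * pos for imp, pos in queries) / total

# Before the test: the page ranks well for just two queries
before = [(500, 5), (300, 5)]
# After the test: the same two queries, plus 200 new long-tail queries
# entering the report at lower positions
after = before + [(50, 35)] * 200

print(round(avg_position(before), 1))  # 5.0
print(round(avg_position(after), 1))   # ~32.8: "worse", despite far more visibility
```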

MICHAEL: Yeah, awesome. That's a great point you make about average position; it's how you interpret the data. And I know when you run an A/B test, you can be guilty of, if you have a favorite out of the two options and it starts doing really well, thinking, all right, I'm onto a winner here, and going with it. But over time, those initial results might fade off. Maybe in this space there's certain data that SEOs should be collecting or incorporating in their tests to make them meaningful. What are some things people might need to be aware of? Is it as simple as, this got more clicks, or are you looking a bit more in depth when you're measuring success?

RYAN: Yeah, it really depends on what the goals are for each individual business that's turning to testing. So for us and certain tests that we run, it might just be a case of, if traffic goes up, that was a successful test. But the point I like to make, and it's a point that a lot of SEOs quite rightly make, is don't necessarily sit there and take traffic as the metric to be tracking. What you really should be tracking is the business impact it's having. So if we run a test, because we track GA4 events and have all our key events set up, people signing up for trials and that kind of thing, and we make some changes to a piece of content and see that, okay, fewer people are now landing on the site, but the people who are landing are much more qualified leads and are converting at a much higher rate, then that's 100% a successful test for us. Even though you might look at it from the outset and go, well, traffic actually dipped 15%, what we were actually doing was getting rid of those unqualified visitors, ranking in more of the right places, and bringing more qualified traffic through. So I think it's always wise to keep an eye on the SEO metrics, traffic, rankings, the number of keywords you're ranking for, but also keep tracking your business metrics as well. That's easy for e-commerce sites and SaaS sites, where the metrics really are tied into the website, but for a local site it might mean tracking phone calls that come through from certain pieces of content. It's always worth keeping an eye on, because I've seen a lot of tests where traffic has gone down but leads have gone up, and that's always a successful test in my book.
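
A simple way to operationalize Ryan's point is to judge a test on conversions and conversion rate, not sessions alone. The sketch below uses invented numbers and assumes you can pull sessions and one key event (say, trial signups) for the tested pages from GA4 for matching before and after periods; the function name is illustrative.

```python
# Sketch of judging a test on business impact rather than raw traffic.
# Assumes sessions and key-event counts pulled from GA4 for equal-length
# before/after periods. Numbers below are invented for illustration.
def business_impact(before, after):
    """before/after: dicts with 'sessions' and 'conversions' for the period."""
    def rate(d):
        return d["conversions"] / d["sessions"]
    def pct_change(a, b):
        return round(100 * (b - a) / a, 1)
    return {
        "traffic_change_pct": pct_change(before["sessions"], after["sessions"]),
        "conversion_rate_before_pct": round(rate(before) * 100, 2),
        "conversion_rate_after_pct": round(rate(after) * 100, 2),
        "conversions_change_pct": pct_change(before["conversions"], after["conversions"]),
    }

# Traffic dipped 15%, but the remaining visitors convert far better:
print(business_impact(
    before={"sessions": 4000, "conversions": 40},   # 1.0% conversion rate
    after={"sessions": 3400, "conversions": 68},    # 2.0% conversion rate
))
# traffic -15%, conversions +70%: a successful test by Ryan's definition
```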

MICHAEL: Yeah, absolutely. A lot of people get wedded to the vanity metric of traffic when they had maybe a blog post ranking for some long-tail term that did absolutely nothing for them. And you quite rightly say it's about making more sales at the end of the day for most businesses; that's why they're interested in SEO. GA4, how do you find it? Universal Analytics, great. GA4, I personally find painful. I think a lot of local service businesses probably don't have a great base of analytics in place. Is that something that needs to be addressed before you even start looking at testing? Like having that bank of data there, tracking those events in GA4, as painful as it is to set up, and having some data to work with before you go down the path of testing?

RYAN: Yeah, I suppose to answer the second question first: you don't necessarily have to track business metrics to run successful SEO tests. At the end of the day, you might just be aware of your conversion rate, whether from UA or from the basic metrics that you do have set up in GA4. You might know your conversion rate is around 2%, and from there you can work out the business impact of a successful SEO test. But I'm actually a big fan of GA4 once you get past the initial learning curve, especially after it first rolled out and UA disappeared, and you'd log into it, look at the dashboard, and kind of hate it for probably six months. When you actually drill down into it and have events set up, even though they are quite painful to set up in the first instance, it's a real advantage. For a SaaS business like ours, I think it's a lot easier, because we only have a few key events that we're really tracking. When you're a multinational e-commerce business, you're going to have a lot of different events: people signing up for newsletters, people purchasing, things like that. But once you've gone through the initial pain of setting everything up, it's definitely an advantage, because SEOtesting does this, and I know there are other tools out there with GA4 integrations as well, which means you can set up an SEO test really quickly, click the key events you want to track, and that data will be included at the end. You can look at it and say, okay, before we made the change we had this many key events completed, and after we made the change we had this many, so this was a really positive test for us as well.

MICHAEL: So maybe, could you tell me a little bit more about SEOtesting itself? Am I right in assuming that it plugs into your Search Console and GA4, aggregates that data, and then looks back over a time period so you can compare stats for the different variants, that sort of stuff? Is that how it works, or is there more going on with the way the tool works?

RYAN: Yeah, exactly that. From the testing point of view, that's how we run the tests: we take data from Google Search Console and GA4, if people have the GA4 integration plugged in, which most people do. We have different types of tests available. We have time-based tests, in three versions, so you can test a single page, a group of pages, or even URL changes, which is good for testing redirects to see whether they have a positive impact. Another successful test we ran with that was taking one large piece of content and splitting it out into smaller pieces. And then we have split testing in the tool as well, where people can say, okay, I want these pages as control and these pages as test, and we'll aggregate the data that way. But there's certainly a lot more to the tool than testing, even though that's the main part. The tool was first built, I think, purely out of convenience. Nick, who founded it, started building it in 2019 out of a need: he got really tired of only having the 16 months of Search Console data that Google retains. And he said, well, what if I build a tool that will archive it, so I can go back in a few years and have years' worth of GSC data available. It came from that, and from there the idea sparked: okay, if I've got all this data available, maybe it makes sense to add testing functionality. And because we have access to all that GSC data, we've got a lot of reports that just save SEOs time, really. We do say that the reports we create are 100% things people can do with GSC itself, but the big thing is it takes a lot of time, and as an SEO, especially agency-side, you only have a set number of hours to work on certain clients. So if we can save SEOs time, then we've done our job well.
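
For readers curious what that archiving idea looks like in practice, here is a hedged sketch using the Search Console API via google-api-python-client: pull a date range of performance data and append it to a local file before Google's 16-month window rolls off. Credential setup is omitted, the output path is illustrative, and a production version would page through results for larger sites; this is not SEOtesting.com's implementation.

```python
# Sketch: archive Google Search Console performance data locally, since the
# GSC UI/API only retains ~16 months of history. Credentials setup omitted;
# output path is illustrative.
import csv
from googleapiclient.discovery import build

def archive_gsc_data(credentials, site_url, start_date, end_date,
                     out_path="gsc_archive.csv"):
    service = build("searchconsole", "v1", credentials=credentials)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": start_date,      # e.g. "2025-01-01"
            "endDate": end_date,          # e.g. "2025-01-31"
            "dimensions": ["date", "page", "query"],
            "rowLimit": 25000,            # page through for larger sites
        },
    ).execute()
    with open(out_path, "a", newline="") as f:
        writer = csv.writer(f)
        for row in response.get("rows", []):
            # keys holds the dimension values in the order requested above
            writer.writerow(row["keys"] + [row["clicks"], row["impressions"],
                                           row["ctr"], row["position"]])
```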

MICHAEL: Yeah. Awesome. Love it. Well, this has all been great. If people want to find out a little bit more about yourself or maybe the tool, where can they go to connect with you once they've listened to the show?

RYAN: Yep. So I'm on all the social networks, LinkedIn, X, Bluesky, Threads, all that kind of thing, where you'll see me talking my ass off about SEO. Lots of threads, lots of posts and things like that. You can connect with me that way. And then yeah, it's SEOtesting.com if you want to learn more about the tool.

MICHAEL: Awesome. Well, Ryan, that's been really interesting. Thanks for joining us on the SEO Show.

RYAN: Thank you very much for having me.

OUTRO: Thanks for listening to the SEO Show. If you like what you heard, don't forget to subscribe and leave a review wherever you get your podcasts. It will really help the show. We'll see you in the next episode.
