Honest Ecommerce

Bonus Episode: Big Swings vs. Small Wins: Embracing the Testing Mindset

Episode Summary

On this bonus episode of Honest Ecommerce, we have Adam Kitain. Adam is the CTO and co-founder of Intelligems, helping ecommerce brands optimize conversion and profits. We talk about adopting a testing mindset, balancing creativity and analytics in CRO, iterating quickly after test failures, and so much more!

Episode Notes

Adam is the founder and CTO of Intelligems, a profit optimization engine for ecommerce brands. 

He and Drew Marconi started Intelligems 3 years ago, after spending 4 years building dynamic pricing in the ridesharing industry, to bring that level of sophistication around pricing and economics to DTC ecommerce.

Intelligems today enables brands to A/B test content, pages, discounts, shipping fees, and prices on their storefronts. It also enables brands to build tailored experiences and personalizations informed by the learnings of those tests.

In This Conversation We Discuss: 

Resources:

If you’re enjoying the show, we’d love it if you left Honest Ecommerce a review on Apple Podcasts. It makes a huge impact on the success of the podcast, and we love reading every one of your reviews!

Episode Transcription

Chase Clymer

Hey everybody, future Chase here. I am just giving you a quick heads up that I had my mic settings a little weird during this next recording. We did our best to tame it down. But you'll still notice it's not as good a quality as we normally bring to the table and just wanted to apologize. 

But we thought the content was amazing. And we still wanted to bring you this episode. Thank you so much and I hope you enjoy the show.

Adam Kitain

Most people are going to make a decision in a very, very short amount of time, and probably only visit that one page. 

Chase Clymer

Welcome to Honest Ecommerce, a podcast dedicated to cutting through the BS and finding actionable advice for online store owners. I'm your host, Chase Clymer. And I believe running a direct-to-consumer brand does not have to be complicated or a guessing game. 

On this podcast, we interview founders and experts who are putting in the work and creating real results.

I also share my own insights from running our top Shopify consultancy, Electric Eye. We cut the fluff in favor of facts to help you grow your Ecommerce business.

Let's get on with the show.

Hey everybody, welcome back to another episode of Honest Ecommerce. 

Today, I'm welcoming to the show, an amazing CTO and co-founder of one of, if not my favorite, products to use as of late. Adam Kitain from Intelligems, welcome to the show.

Adam Kitain

Chase, thanks for having me. I am pumped to be here.

Chase Clymer

So, I bring up the product way too much on this show. But for anyone who doesn't know what the hell Intelligems is, tell me what it is.

Adam Kitain

So, Intelligems at its core is a testing platform for Shopify ecommerce brands. So that's hopefully who your listeners are.

What we also say is that we are a profit optimization engine so unlike a lot of other testing tools, we are focused a lot on revenue and profit as the outcome, not just conversion. And we'll talk a little bit about that and why.

Our background: Drew, who's my co-founder and our CEO, and I worked together in ridesharing and built the dynamic pricing algorithms and systems there, and a lot of what was powering that was A/B testing. And at the end of the day, what we were trying to drive was just higher revenue and higher profit for the ridesharing company, called Via, in New York.

And so when ridesharing ended in 2020 due to COVID and they had to take a left turn, we were like, “Hey, where can we bring our knowledge of profitability and profit optimization?” We had some friends running Shopify brands. And now it's three and a half years later. And yeah.

Chase Clymer

Yeah, I've been familiar with the product for quite some time. When Google Optimize went away, I think you guys made a big swing that paid off. And that's when you really landed on the map for a lot of consultants, I'd say. 

So I guess for me to explain it to the audience real quick: it's an amazing platform plugged right into Shopify, and it allows you to test not only content, your traditional A/B test, so change some headlines, try this layout versus this layout. You can even do theme versus theme now.

So from a content perspective... But that was the last piece of the puzzle you guys brought to the table, which I thought was funny. The product started with a free shipping threshold. Was that the MVP of the product?

Adam Kitain 

Actually, the first version of the product was testing your product list prices. So one of our first customers was Shinesty. I'm in Denver, Colorado. They're also a Colorado-based brand, and they sell fun party apparel, like a three-piece Santa Claus suit where Santa is getting high or something... the kind of stuff that you would wear to a college party, I guess.

So anyway, Shinesty also sells boxers, and their thing is the ball hammock. That's one of their hero products these days.

And so our first ever big price test, which was with them, was testing their boxer prices: $24.99 versus $25.99 versus $26.99 for a single pair of boxers. And that was where we got started.

And honestly, I didn't expect any measurable effect from just that $1 price change. And it was tremendous. It was something like 4% of conversion per dollar. I was like, if you were willing to pay $25.99, how could you not be willing to pay $26.99? But you know, a handful of people weren't. And so that was kind of an aha moment for us.

And so yeah, the first product was actually just testing prices. And then that kind of snowballed into testing your free shipping threshold, the shipping rates (how much to charge for each rate, express and standard, and things like that), discounts, and offers.

And so whenever people would be like, “Hey, can you also help me test my PDP layout or some sort of theme stuff and design?” we would always say, “Use Google Optimize.” We're not going to compete with a free Google tool, because we're just a couple of folks over here. So we were trying to stay in our lane.

Chase Clymer

And then Google did the Google thing.

Adam Kitain

Yeah. We talked to a bunch of different folks about why Google did that, and I don't have a great answer. I don't know if you have any insight into that, but honestly, it was good for us.

Because there were a lot of people we were talking to like, “Hey, you guys do CRO, you should also try testing prices.” Some people picked up on that, but a lot of people were like, “Eh, I don't know. I'm not sure.”

So we had good traction for the first two years with, I would say, early adopters. But when Google Optimize shut down, it was pretty straightforward for us to tack on content testing, like, “Hey, test images.” That was a cheaper, easier problem to solve.

And so we tacked that on, and I would say the middle part of the adopter curve has really started to latch on. So that's been really exciting.

Chase Clymer

That is me. I love the product. We use it a lot. It's a requirement now to work with us on a retainer basis. You need Intelligems. How else are we going to know that what we're doing is doing what we want it to do? 

Adam Kitain

That's awesome. 

Chase Clymer

It's such a cool tool. But we're not here to talk about the tool. We're actually here to talk about just CRO in general, split testing, and just maybe a primer for the folks listening.

Let's just talk about CRO basics, right? And I'm going to belabor this a bit. 

So conversion rate optimization, at the end of the day, it's a bit of a misnomer in the way that Ecommerce has claimed this one word to be synonymous with the percentage of people that make a purchase on your store versus visit it. 

But historically, conversions are an action on your site that you want the user to take. So if you have a restaurant, and you want people to click a button to call the phone number, if that's what you want, that's the conversion. And then you optimize for that result. 

But just with the rise of Ecommerce and Shopify and what they named some of their KPIs, which is the right KPI, conversion rate optimization became synonymous with just working on raising the conversion rate, which is sort of not true.

Because there are other KPIs that you can optimize for. Ones we work on all the time are average order value, revenue per session, and profit. It's still a byproduct of conversion rate optimization. So I don't know. That's my rant about the naming convention of this stupid service.

Adam Kitain

Honestly, I think it's such an important point. And I think that being hung up on conversion rate as orders divided by visitors is not the full equation. And because I imagine most people are listening, I won't do too much math, like, audibly, but I'll do a little bit.

Which is like, we have a few metrics that we try to anchor folks on. Conversion rate is one of them, revenue per visitor and profit per visitor are the other two. 

Revenue per visitor is conversion rate times AOV. So orders per visitor times dollars per order gives you dollars per visitor. That's: how many dollars am I getting out of every visitor that lands on the site? And to your point, AOV is a really important part of that equation, not just conversion rate.

And why is that important? A very simple example: imagine you have a set of users that land on your page and get no discount, and then group B, where you're offering them a 10% off discount. Obviously, group B is going to convert better than group A when A gets no discount and B gets 10% off.

So the 10% off group is going to convert better, and if you were to look at that on its face, at just conversion rate, you should just give everyone 10% off. Take that to an extreme, and we should just give your product away for free.

Obviously, we need to take AOV into account, because what that 10% discount is potentially doing is reducing your AOV by 10%. Then you start getting into some price elasticity math, but basically, if the conversion rate times AOV in the second group is less than the conversion rate times AOV in the first group (so revenue per visitor in group B versus group A), the discount is losing you money. At the end of the day, I'm paying money to drive traffic to the site, so I might as well try to get the most dollars from every person who lands on it.

That's revenue. 

And then profit is taking that even one step further: “Hey, are they buying products that are high margin? Am I eating too deeply into my margin with this 10% off?” And things like that.
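
To make that math concrete, here's a minimal sketch in Python. All the numbers below (conversion rates, AOVs, margins) are invented purely for illustration; this isn't Intelligems data:

```python
# Compare a no-discount group (A) against a 10%-off group (B) on
# revenue per visitor and profit per visitor, not just conversion rate.

def per_visitor_metrics(conversion_rate, aov, gross_margin):
    revenue_per_visitor = conversion_rate * aov          # (orders/visitor) * ($/order)
    profit_per_visitor = revenue_per_visitor * gross_margin
    return revenue_per_visitor, profit_per_visitor

# Hypothetical numbers: B converts better, but the discount cuts AOV,
# and the discount comes straight out of margin.
rpv_a, ppv_a = per_visitor_metrics(conversion_rate=0.030, aov=100.0, gross_margin=0.40)
rpv_b, ppv_b = per_visitor_metrics(conversion_rate=0.033, aov=90.0, gross_margin=0.33)

print(f"A: ${rpv_a:.2f} revenue/visitor, ${ppv_a:.2f} profit/visitor")  # $3.00 / $1.20
print(f"B: ${rpv_b:.2f} revenue/visitor, ${ppv_b:.2f} profit/visitor")  # $2.97 / $0.98
```

In this made-up case, group B converts 10% better yet still loses on both revenue and profit per visitor, which is exactly the trap Adam is describing.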

Chase Clymer

I mean, it's another cool thing about your product, but you can do this with any testing product to be honest. 

You have all the data, you can run all these different analyses and something that... We're running a test right now where we made a big swing and we were wrong, but we learned a lot of cool things, which one of them was... 

Adam Kitain

Interesting. 

Chase Clymer

We learned how to raise the AOV for sure. That wasn't the problem.

And so on the surface, it's like the test is losing, like we're gonna need to take another iteration of it. 

Adam Kitain

But can you share more details about the test here?

Chase Clymer

Well, it's a super complex one. And this is what we were talking about in Slack. But basically, we have a client that has a bunch of different variants. The old layout of their store had a bunch of variants as individual products, and then we combined all those variants. And in doing that, we know that it raised the conversion rate. That wasn't the hard thing.

So we took it a step further and then switched up how we were presenting the offer and got rid of the lowest quantity available. And so we did a lot of things with this one test, big swings. We believe in bigger tests versus smaller tests, which is something we can get into a little bit later. 

But with this particular example, where I screwed up, probably, is that the way I presented the offer was a bit confusing. So we're now changing the copy, not the offer. And I was like, “Hey, this is more profitable. The people that did buy this are 10 times more profitable…” Okay, that number is wrong. “It is noticeably more profitable in this group, even though it's failing the test.”

So maybe that's a good segue. You can ask me follow ups about that. But that's probably a good segue into just the testing mindset. 

Adam Kitain

Yeah, I mean, I love the position of taking big swings. And also, not every test is going to be a winner, but you're always learning things.

Chase Clymer 

That's a new question that I ask with every conversation I have with clients about working together on a retainer. I'm like, “Hey, this will come out soon. We have a new offer where we'll launch your first test for free.” 

And then I ask, “How do you respond when this first test fails? What do you think of us? What do you think of the test?” Just to learn where people are at with the idea of testing. And not to bury the lede, but basically, not all tests are winners.

Adam Kitain

Yeah. Actually, we have some data on this, and this is a much higher percentage than I expected, but about 55% of tests are winners. Honestly, that's only slightly better than a coin toss for whether people end up with a winning outcome on any given test. Which means that…

Chase Clymer 

And this is across the Intelligems platform.

Adam Kitain

Sorry. This is based on… We did this analysis pretty recently, and it was across about 4,000 tests.

Chase Clymer 

That's so cool. 

Adam Kitain

A pretty broad data set. And it means that people's sites are pretty optimal, in that not every change that you make is going to create a more optimal version. But at the same time, there's a ton of room for optimization. Any one test is only slightly better than a coin flip of having a winner, but for the brands who have run three or more tests, 95% of them had at least one winner.

So if you do this a couple of times, you're more and more likely to find something because you're learning things along the way too. And so it's a process. It's not just like, “Hey, I want to do this. I'm going to get a winner and I'm going to put it in the bag and go home.”

Chase Clymer 

It's a marathon. It's not a sprint. 

We felt so good about some tests before we did them. And then you get the data and you go, “Well, that's interesting. Why is it this? Why are we not winning?” 

And then you go and look again, now knowing it's losing, and stuff becomes kind of obvious in retrospect. You're like, “Duh, this is why. How did we let this happen?” And so you quickly iterate, and it helps you learn that much faster.

And I think the mindset for anyone that's going to either start doing CRO themselves or work with a team that's going to do CRO for them is a testing mindset: a losing test isn't a waste of time. It's a learning opportunity.

Adam Kitain

100%.  

And one of the things that we struggle with is how we position it when, okay, in this test, the control group was better. How do we say that's still a good outcome? You learned something; we potentially avoided making a mistake with something we thought was obviously better, but it wasn't, for X, Y, and Z reasons.

And then another interesting perspective is sometimes you have to dig a few layers into the data. Because a lot of times what happens is a test won't work. Someone is like, “Oh, this is obviously gonna be better.” And then it's not. And then it's like, well, my first reaction is, the data doesn't support my gut. And so therefore the data is wrong. 

That is one of the biggest challenges as an A/B testing tool. Like, hey, I have to trust the data. And so one of the things that's really important for us is having a lot of drill-downs, and even being able to go export orders and all the data, to try and map it and reconcile it with Shopify.

And I think that at the end of the day, I used to work in a data science organization at IBM and there was an executive that used to say, “I only trust data if it supports my intuition,” which was to me was so painful because I was like, “Well, sometimes your intuition could be wrong,” and that's exactly when you want to be able to rely on the data. 

And so I think for me and for Drew, one thing that was just really, really important is establishing trust in the data, and then being okay... And there are caveats, a bajillion caveats to that. But yeah, I don't know. I'll get off my soapbox for a second.

Chase Clymer

You need to trust the data and you need to just accept that every test isn't going to be a winner and be okay with it and understand that it's a learning opportunity. It's not a waste of time or resources or skill or whatever. 

But let's talk about how you got winners and losers. How the heck do we determine that? 

Adam Kitain

Yeah. And also, what makes a winner.

That's a really important question. We talked about, “Okay, if it's not conversion rate that we're just optimizing for, maybe it's revenue, maybe it's profit.” And what happens if revenue is up and profit is down, or vice versa?

And I think there's a lot of questions that you have to ask yourself, like, “Where am I in my business lifecycle? Am I early? Am I trying to get my product in the hands of more people and get it out there?” 

Let's say I have a new pancake mix that I'm really excited about, and I want to get out into the world and people are excited about it. So at the end of the day, if I run a test and I can see that it's driving more orders but like slightly lower profit for my store, maybe that's okay. Maybe my intention is to try and drive more volume or I'm going to have pancake mix expiring in my warehouse if I don't sell it soon enough. 

The goal is to try and drive more orders. The flip side is, if I am an eight-figure brand, the 10% hole that I had when I was a $500,000 brand is a lot more painful now that I'm a $50 million brand. And so I need to right-size my profit.

And at the end of the day, a lot of brands are in that situation today. I've got to work on cash flow, and I've got to make sure that the money I'm spending on marketing is actually driving returns, not just volume. And Dave Rackooch, I think it's Dave Rackooch who says it, maybe he's quoting someone else, told me, “Revenue is vanity, profit is sanity.”

And so that's why we focus so much on profit. 

But at the end of the day, there are so many different considerations. It's hard to say, but I think one of the important things to do to determine a winner is, before you run the test, understand the hypothesis you're trying to test and the outcome you're trying to achieve. Am I trying to drive volume? Am I trying to drive profit? Am I trying to drive total top-line dollars or bottom-line dollars, or whatever it might be? Then really focus on that, and also focus on the drivers of that.

Like, “Is this working because of AOV?” “Is this not working because of discounts, or not working because I lowered the price, but now a bunch of people are below my free shipping threshold?” There are just so many other factors at play here. So that's a really big part of it.

Chase Clymer

Absolutely.             

Now, when you do run a test to determine a winner or a loser, there's this thing in the industry called statistical significance, or “stat sig” for the nerds. What the heck is it?

Adam Kitain

Yeah. So stat sig… And honestly, one thing where I think we could help people do a better job, if I'm being totally frank, is understanding statistical significance better and incorporating it more into their decision making.

So there are a couple of ways of thinking about it. Let's say you run a test where you're testing a new way of positioning your variants versus the old way of positioning the variants, and it says this is now generating an uplift of 5% in revenue per visitor per month. Then one thing we can look at is statistical significance, which is: what is the probability that the new way is actually better, on a revenue-per-visitor basis, than the control?

And so what we do is we're measuring conversion rate, we're measuring AOV. Those are just measurements, and the more data you get (you may be familiar with the law of large numbers), the more likely it is that the measurement you have observed is representative of the total population.

And so as you get more data, the confidence tightens: the width of the distribution of outcomes gets narrower and narrower. For revenue, what we're doing is we have a distribution, or an interval, for conversion rate, and an interval for AOV. We multiply those two things together, and then we end up with an observed revenue-per-visitor value and a distribution around it.

Then what we're saying is, “At how many points along that distribution is this challenger better than the control?” And that's where we get the probability for the challenger to be better, or the probability for the control to be better.

So you and I may say, “Okay, it's 80% likely that group B is better than group A.” 

Two caveats. One caveat is that 80% feels like a lot, or even 70% or 60%. Like, “Oh, that's better than 50%.” But what you have to realize, first of all, is that for the probability to be better, the baseline is 50. So 50/50 means it's actually a total coin toss. You have no idea.

So once you get to 60/40, it might seem like, “Oh, 60%. It's almost passing.” But actually it's barely better than the no knowledge baseline of 50/50. 

So I think people's perspective gets a little bit skewed, because it's easy to be like, “Oh, 70% or 80%. That's a passing grade in calculus in college.”

But here, 80% confident still means that one in five times, what we're calling the winner is not actually the winner. So that's one caveat.

The other caveat is that it's just the probability to be better, not to be specifically 5% better. It could be 2% better or 1% better and still be better. And so there's a range of outcomes on the measured value.

And so we're saying, “Oh, it's 5% better,” but actually, that could be 2%, it could be 7%. 5% is just the expected value based on what we're measuring.

There's a lot of… this is getting a little wonky, but there are a lot of considerations around how much better it is and how likely it is to be better.

And at the end of the day, you also have to remember you can get more data. You could let this test run longer, and that will help increase the statistical significance and get you to a higher degree of certainty.

But you also have to think about, “Well, what's the downside? What if I'm wrong? What if I do this thing?” There's a four out of five chance that it's better. 

And if there's a one-in-five chance that it's not, you think, “Could I just roll this out, go on with my life, and try the next thing?”

And that's why one of the things that we talked about before was taking big swings. Maybe this is just not the big swing; maybe this is probably better, and let's just move on to the next thing. And so time is also an important resource of yours that you can't get back. That's also a factor to consider.
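
For readers who want to see mechanically what “probability to be better” means, here is a minimal Monte Carlo sketch in Python. This is not Intelligems' actual model, just one common way to build those distributions: resample each group's conversion rate and AOV from the observed data, multiply them into revenue per visitor, and count how often the challenger comes out ahead. All inputs below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_challenger_better(visitors_a, orders_a, order_values_a,
                           visitors_b, orders_b, order_values_b,
                           n_samples=20_000):
    """Estimate P(revenue/visitor of B > revenue/visitor of A)."""
    # Conversion-rate uncertainty: Beta posterior with a flat prior.
    cr_a = rng.beta(orders_a + 1, visitors_a - orders_a + 1, n_samples)
    cr_b = rng.beta(orders_b + 1, visitors_b - orders_b + 1, n_samples)
    # AOV uncertainty: bootstrap the mean of the observed order values.
    aov_a = rng.choice(order_values_a, (n_samples, len(order_values_a))).mean(axis=1)
    aov_b = rng.choice(order_values_b, (n_samples, len(order_values_b))).mean(axis=1)
    # Revenue per visitor = conversion rate * AOV, as discussed above.
    return float(np.mean(cr_b * aov_b > cr_a * aov_a))

# Hypothetical test: 10,000 visitors per group, B converts slightly better.
vals_a = np.array([48.0, 52.0, 55.0, 61.0] * 75)   # 300 orders in group A
vals_b = np.array([50.0, 54.0, 58.0, 62.0] * 80)   # 320 orders in group B
p = prob_challenger_better(10_000, 300, vals_a, 10_000, 320, vals_b)
print(f"Probability B beats A on revenue per visitor: {p:.0%}")
```

A result near 50% is the no-knowledge coin-toss baseline Adam mentions, and even 80% still means that one time in five the “winner” isn't actually the winner.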

Chase Clymer

Yeah. I think that once the test gets to a point where there's arguably a deterministic winner, move on with your life to the next one. Because it's all about these incremental increases.

The 1% better is going to change your business. If you can get 1% better every day, every week, whatever, that's going to change the trajectory of your business over time versus one thing that you know is right that does a 5% increase or whatever. It's those little wins repeated over time that are really going to change things. 

And yeah, you're right. You also can't test a bunch of shit at the same time. You can do a test where there are a lot of changes at once, but you can't run multiple overlapping tests at once.

You can do a few different variants, but it gets to a point where the test gets lost within itself. You're testing too much, and you don't actually know what's going to fix these things. So you need to roll it back and get a little more method to your mayhem.

Adam Kitain

Totally. Yeah. 

And that's why there's this overlap between art and science here. And also something that we were talking about before we hit the record button was there's a special kind of person that needs to wear this CRO hat, right? 

You need to have a little bit of an artist's sense. You also need to be a little bit of a math geek. And I think that you need both, because at the end of the day, the data can tell you a lot of stuff, but you can also slice and dice the data in a lot of different ways and get it to tell you a lot of different stories.

I think there's both the opinionated, artistic side, the “I understand my customer and what works well, and I also understand the dynamics of interaction on an Ecommerce site,” and also this “Let's get geeky and get into the weeds of AOV times margin times conversion.”

Chase Clymer

Yeah, it's like the hat that a conversion rate optimist would wear… or an optimizer. I don't know what it would be. 

Adam Kitain

An optimist. 

Chase Clymer

Whatever that nerd does, you're going to need some knowledge of UX patterns. You've got to have some knowledge of copy. You've got to have some knowledge of offers and positioning and benefits versus features. There's a lot of stuff you need to know.

And then you have to have the ability to challenge your own beliefs about the product or just what you are reading on a page. And it's like, you read something and you take it as fact, and then you go “Well, what if I don't know this brand? I don't know this product. Would I believe this statement? What do I need to see to believe this statement?” 

Because a lot of what you're doing is... All conversion rate optimization is like this philosophical argument of helping you help the buyer understand that the product will solve their problem. And so you're trying to get rid of any fear, uncertainty, or doubt around that purchase, which is more on the content testing side of things. 

If you're going to be changing UX and whatnot, that can make some swings. But copy and content, oftentimes, are what's going to get you a better outcome from a CRO perspective.

But yeah, now we're just getting into the weeds on the psychology of how you should think about tests and stuff like that. 

Adam Kitain

Yeah. I mean, I think that psychology is a really important thing. I wish I had a degree in psychology or took more classes in college because even when I think about…for me as a founder and when we were in the very early days, people would always be like, “Why do people buy your product?” I’m like, “I don't know.” 

And there's this concept of people buying your products to make more money, save money or time, or because there's some sort of social norm or pressure to do so. And you have to understand, for an Ecommerce brand, you're doing the same thing. You're trying to sell a product and why is it that people want to buy this product? 

What need is it satiating, or what fear is it abating? And so I think that psychology is a huge part of this. And it's pretty interesting.

Chase Clymer

Yeah. I've got some book recommendations now, because you made my brain go there. There's a book called Predictably Irrational by Dan... I can't remember his last name. It starts with an A. That one's really cool. It gets you into this weird stuff about consumer psychology.

There's another one just called Irrational, or Influence, the power of psychology on purchasing or something like that. Influence is big on the front cover. 

And then one that's just straight up about optimizing websites is called Websites That Win. So if this is something that you're into, those 3 books are probably... The first 2 are definitely a little easier to read. The third one is just nerdy and fun. But those are some avenues that you could go down. 

Before we end though, what kind of tests have you seen that work? What are people doing with the product? Let's talk about some real nuts and bolts here. What are people testing? 

Adam Kitain

Honestly, so many different things. If I go back to our roots for a second, price is such a huge lever that I think is under-touched and underserved.

Chase Clymer

Yeah, I think people are scared of it. 

Adam Kitain

People are super scared of it. And one of the things that we try to do... But it's such a huge lever and it's so tremendously important. So it's price. 

And then there's also the free shipping concept. I think Amazon has done a disservice to a lot of consumer mentality, and has also created a lot of fear in brand owners, like, “I have to offer free shipping for a large chunk of orders or I'm going to lose.”

But I don't know, that's something that we have found to not necessarily be the case, or that at the very least the free shipping threshold is a really great driver to push AOV.

You end up trading off AOV with conversion rate, and so that's where it gets a little tricky. A higher threshold can lead to higher AOVs, but also a potentially lower conversion rate. This is also why it's hard to form a strong intuition, but anything with those kinds of dollars and cents can be a tremendously large swing, which we're pumped about.
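
As a concrete illustration of that tradeoff, here's a small sketch in the same spirit as the earlier one; every number below, including the shipping fee and the share of orders paying it, is invented:

```python
# Compare two hypothetical free-shipping thresholds on revenue per visitor.
# A higher threshold lifts AOV and collects more shipping fees, but can
# depress conversion; the winner depends on how those forces net out.

def rpv(conversion_rate, aov, share_paying_shipping, shipping_fee):
    revenue_per_order = aov + share_paying_shipping * shipping_fee
    return conversion_rate * revenue_per_order

low = rpv(conversion_rate=0.032, aov=58.0, share_paying_shipping=0.20, shipping_fee=6.0)
high = rpv(conversion_rate=0.030, aov=66.0, share_paying_shipping=0.35, shipping_fee=6.0)

print(f"$50 threshold: ${low:.2f} per visitor")   # ~$1.89
print(f"$75 threshold: ${high:.2f} per visitor")  # ~$2.04
```

With these made-up numbers the higher threshold wins despite the lower conversion rate; with slightly different numbers it would lose, which is why the intuition is hard to form without testing.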

But also, I love it when people totally think about the funnel, the flow, the education. I think one thing… I was talking to someone who has a CBD brand and he said, “I can't sell on Instagram marketplace and Facebook marketplace, or not marketplace but Instagram shop, because I'm a CBD company.” 

However, he also doesn't want to sell on Instagram Shop, because so much of his product, and so much of what he's optimizing for, what they've tested and found really moves the needle, is around education: brand education, product education, how to use it, what it's good for.

And so actually making sure the buying funnel and the buying journey have the right balance of content and education, while also not adding too much friction between a new user landing on the site and buying. I think for him, that was what he said really made a huge, tremendous difference. And not just in boosting conversion, but overall: to his bottom line and to the long-term feasibility of his business.

So anyway, I thought that was really insightful. 

Chase Clymer

Yeah. On our side, we've got a lot of different tests running. But obviously, one that I think a lot of people should do: if you do have variants of your products, say flavors or quantity discounts, and you have those separated as individual products, test individual products versus a single PDP with those as options and variants.

And using Intelligems, it was pretty straightforward to do that with a theme test. And I can't even tell anyone how to do that, because it's a little more 300-level in terms of how to set that test up. But that's one you should probably run.

Navigation architecture is like... People put dumb stuff in their navigation. And that's a really easy test to run if you're on the content side of things. 

I think that once you have your flagship product, and maybe your flagship offer, sorted out for a more established brand, oh buddy, you get that thing built out on Shopify 2.0 and you can just start trying new things with Intelligems: new copy, new positioning, new offers. It just makes things so easy.

So we've got a brand we work with that's an as-seen-on-TV brand. They're sending crazy amounts of traffic to one landing page and we're always just trying new things. It's really interesting to see the results. 

Adam Kitain

Yeah. I mean, landing pages in general, you have like 45 seconds to… most people are going to make a decision in a very, very short amount of time and probably only visit that one page. 

That is a tremendous opportunity: when someone lands on your site, what are the first things that they're seeing? What are the first things that they're learning and understanding about your product and your brand? They're making that decision really fast.

Not in all cases, of course. There's buying furniture, and that's a considered purchase; you might spend a lot of time doing your research about what couch you want to buy.

So not every brand is the same. But definitely, it's something. When someone lands on your site, understand what it is that you want to show them first. That would be a great place to start. 

Chase Clymer

Adam, it's been so wonderful having you on the podcast today. Obviously, I'm a big fan of the product. Everyone go check out Intelligems. If you want our help figuring it out, please reach out to the agency. You're going to see a bigger partnership between myself and Intelligems coming to fruition here in the next few months. 

What else are you going to leave our audience with today? Where should they go to check out the product, learn more about you? What should we do? 

Adam Kitain

I mean, Chase, thanks for having me on. This has been a blast. I love to jam about this kind of stuff, and hopefully we can do this again in the future. Check out Intelligems at intelligems.io. You can also find us in the Shopify App Store.

Yeah. Shoot us questions because we have a great team of folks who have a lot of great ideas, and shoot Chase questions because he's an expert in this stuff. I love just jamming on this stuff. And I love when people get engaged and start sharing ideas with each other. So that's awesome. 

Chase Clymer

Awesome. Thank you, Adam. 

Chase Clymer

We can't thank our guests enough for coming on the show and sharing their knowledge and journey with us. We've got a lot to think about and potentially add into our own business. You can find all the links in the show notes. 

You can subscribe to the newsletter at honestecommerce.co to get each episode delivered right to your inbox. 

If you're enjoying this content, consider leaving a review on iTunes; that really helps us out.

Lastly, if you're a store owner looking for an amazing partner to help get your Shopify store to the next level, reach out to Electric Eye at electriceye.io/connect.

Until next time!