More is not always better when working to raise SEO rankings. Too many duplicated keywords can drag a ranking down, while a change to just one keyword can drastically bring it up. Too many pages can lead Google to block crawling, and too little traffic will result in few to no conversions. Nick Eubanks, founder and CEO of From the Future, gives us insight into the delicate balance of getting it “just right.”
Please tell me a little bit about your background before you founded From the Future.
I have previously built and sold two software companies. I had been working at an e-commerce company called Traffic Safety Store for two years when I founded I’m From the Future. I was doing a little bit of consulting and selling some info products from affiliate revenues, so it was created strictly as a tax shelter. It was never supposed to be an agency; it was never supposed to have clients; it was never supposed to have employees. This was all completely by accident, a happy accident! I left Traffic Safety Store in October of 2017, when it had grown to 20 full-time employees, so I’ve only been full-time at my own company for a little over a year.
The name was never a well-thought-out plan, so I just ran with it as best I could. We had lots of discussions about renaming the company and ultimately decided to leave the brand in place to embrace some of the heritage of that happy accident. However, after acquiring Webris and True Voice Media in January 2018, we had over 30 people, and it probably didn’t make any sense to have “I’m” in the brand name anymore, so we are now From the Future.
Your site says, “we specialize in marrying technical SEO and user experience.” What does that mean?
What sets us apart from pretty much everybody is that we understand integrated digital marketing for bringing in qualified traffic AND can design beautiful pages that rank well. Therefore, we love being involved in the process of planning and designing pages and experiences as early as possible. Frequently, however, that’s not possible, as companies that are doing billions (with a B!) of dollars a year in revenue are well established before they come to us. In those instances, we’ll go in and find ways to redesign pages and change things such as HTML structure, where the page lives in the document tree, and how the page elements are being rendered: all the things that are important for SEO ranking.
Sometimes there are several pages whose redundancy is cannibalizing relevancy for their keywords. Based on our review of the analytics, we might need to combine them into just one page. Or we may find that the search engine results for a two-word keyword differ from those for a three-word keyword built from the same two words plus a modifier. For instance, if the results for “Men’s Shoes” are drastically different from the results for “Men’s Shoes Coupons,” it may make sense to maintain separate pages.
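The consolidate-or-separate decision above can be approached as a simple overlap check between the two keywords’ top search results. This is a minimal sketch under assumed data: the result lists and the 30% threshold are invented placeholders, not From the Future’s actual method.

```python
# Compare the top results for a head keyword and its modifier variant;
# low overlap suggests distinct search intent, so keep separate pages.

def serp_overlap(results_a, results_b):
    """Fraction of URLs shared between two result lists (Jaccard index)."""
    shared = set(results_a) & set(results_b)
    union = set(results_a) | set(results_b)
    return len(shared) / max(len(union), 1)

# Hypothetical top results for the two keywords.
mens_shoes = ["a.com", "b.com", "c.com", "d.com"]
mens_shoes_coupons = ["c.com", "x.com", "y.com", "z.com"]

overlap = serp_overlap(mens_shoes, mens_shoes_coupons)
# Below some threshold (say 0.3), treat the intents as distinct and
# keep separate pages; above it, consider consolidating into one.
print(overlap)
```

In practice the input would come from real SERP data, but the decision logic stays the same.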
What is marketing automation?
Have you ever been on a website, taken an action, and later received an email? Receiving a follow-up email from a company after abandoning items in a shopping cart is a pretty common example. So, let’s say you are already an Amazon customer and you view a product. You’ll then receive an email with suggestions for not only that product but similar ones, or when the price drops, or when availability or inventory changes. Those are all automated triggers that exist as if/then relationships within the database: when users take certain actions, certain results happen automatically.
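The if/then relationships described here can be pictured as a rule table mapping user events to automated actions. This is a minimal sketch with invented event and action names; a real platform would store the rules in a database and queue the emails asynchronously.

```python
# Hypothetical if/then trigger rules: each user event maps to the
# automated actions it should fire, possibly conditioned on context.

def pick_triggers(event, context):
    """Return the automated actions fired by a single user event."""
    actions = []
    if event == "cart_abandoned":
        actions.append("send_cart_reminder_email")
    elif event == "product_viewed":
        actions.append("send_similar_products_email")
        if context.get("price_dropped"):
            actions.append("send_price_drop_alert")
    elif event == "inventory_changed" and context.get("back_in_stock"):
        actions.append("send_back_in_stock_email")
    return actions

print(pick_triggers("product_viewed", {"price_dropped": True}))
```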
Marketing automation can be very useful for software, especially for enterprise software companies, whose contracts might run a hundred thousand dollars. It’s a long buying cycle because it’s a large investment. So, a potential customer visits their website and enters an email address to access a white paper or a case study. The next day they’ll receive an email with a bit more information; three days later they’ll get another email; and five days later they’ll receive an invitation to a webinar. After attending the webinar, they’ll receive a follow-up email from a salesperson trying to arrange a live demo. All of that, even when it feels like humans are actually taking the time to review individual customer activity and send those emails, is almost always automated, or at least it should be.
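The drip cadence in that buying-cycle example is just a list of day offsets from the initial download. As a sketch, with assumed step names matching the sequence described:

```python
# Map each drip step (day offset from signup, action) to a send date.
from datetime import date, timedelta

DRIP_STEPS = [
    (1, "info_email"),       # next day: a bit more information
    (3, "follow_up_email"),  # three days later
    (5, "webinar_invite"),   # five days later
]

def schedule_drip(signup_day):
    """Return (send_date, step) pairs for one lead's drip sequence."""
    return [(signup_day + timedelta(days=d), step) for d, step in DRIP_STEPS]

for send_date, step in schedule_drip(date(2019, 1, 1)):
    print(send_date, step)
```

A real automation platform evaluates schedules like this continuously and branches on behavior (e.g., whether the webinar was attended), but the underlying data is this simple.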
When creating a social media strategy, are there some social media platforms that perform better than others for specific industries?
I think you could probably make some educated guesses. If you’re a consumer brand, spending money to promote your video on Facebook is probably a good choice, but that same spend will likely not be as effective for selling B2B services. I’m making big blanket statements, but if you’re a B2B company, you probably want to start on a platform like LinkedIn, or in business communities and forums, or other places where the B2B audience exists.
On the other hand, certain channels that work really well for some brands in an industry may not work for another brand in that same industry. Understanding what’s working now takes a great deal of activity and simultaneous testing of different campaigns across multiple platforms. I think the smartest thing a company can do is to spend little bits of money across as many channels as possible, testing them all, sort of shotgun style, and immediately closing down channels that are not performing. I think way too many companies spend way too much money to get organic and paid social media results on platforms that are just not going to work. They need to be willing to deprecate those channels and invest more heavily in the ones that are driving results in the short term. If at some point they’ve created surplus returns from the channels that work, then they can go back and re-explore any channel they want.
Where does conversion optimization start?
We have a UX department with a head strategist and dedicated practitioners underneath him, which is really cool. Usually, we will run a UX audit to identify a lot of small changes that can be made pretty quickly. Interestingly enough, more often than not it’s removing things. Most sites tend to overcomplicate their conversion funnel, or don’t even realize they have a conversion funnel at all, so they just throw the same form up on every page, with no branch logic and no progression as to where and when the form appears. Fixing this is simple once you stop and really think about how the conversion experience should work from a user’s perspective. So that’s another win that can happen very quickly.
Where CRO (Conversion Rate Optimization) gets tough is when sites don’t have much traffic. A site that has, let’s say, less than 10,000 visits per month and is not willing to spend money to acquire traffic will not get statistically significant results at any meaningful confidence level.
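The traffic threshold here is easy to sanity-check with the standard two-proportion sample-size formula. This sketch hard-codes z-values for 95% confidence and 80% power, and the conversion rates are illustrative assumptions:

```python
import math

def required_sample_size(baseline, expected, z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variant to detect baseline -> expected
    at ~95% confidence and ~80% power (two-proportion z-test)."""
    variance = baseline * (1 - baseline) + expected * (1 - expected)
    effect = expected - baseline
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Detecting a lift from a 2% to a 2.5% conversion rate:
n = required_sample_size(0.02, 0.025)
print(n)  # roughly 14,000 visitors per variant for this small lift
```

With under 10,000 visits a month split across two variants, a test like this would take months to conclude, which is exactly the problem described above.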
Let’s talk about how you were able to get KicksUSA a 97.31% increase in organic traffic.
Well, the first thing we cleaned up was their architecture, which was a mess. It was completely flat, and there was no real rhyme or reason for the way they were creating categories. None of it was based on product variations, and there wasn’t any keyword strategy behind their category names and product architecture.
Then we went through the site and found a situation that’s very common in e-commerce sites with lots of products and filtered searching. When product options can be selected to filter down results, a lot of times those filters are driven by changes in the URL, but there’s no material change to the meta attributes or the page content, so it creates massive crawl bloat. A site that has maybe 10,000 products with a hundred different product variations suddenly has a million URLs that Google has to crawl. That’s going to use a whole bunch of Googlebot bandwidth, and that costs Google more money, and Google doesn’t like to cost Google more money! So often, the answer is to simply block lots of those pages, if not from being indexed in Google, at least from being crawled, at the server level.
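The products-to-millions-of-URLs jump comes from combinatorics: when every combination of filter values gets its own URL, the count grows multiplicatively per facet. A sketch with invented filter names:

```python
# Faceted navigation: each filter can be unset (None) or take one of
# its values, so the crawlable URL variants multiply per facet.
from itertools import product

filters = {
    "size": ["7", "8", "9", "10", "11"],
    "color": ["black", "white", "red", "blue"],
    "brand": ["nike", "adidas", "puma"],
}

options = [[None] + vals for vals in filters.values()]
# Drop the all-unset combination (that's the base category page).
combos = [c for c in product(*options) if any(c)]

print(len(combos))  # filtered URL variants for ONE category page
```

Three modest facets already yield over a hundred variants per category; multiply that across thousands of categories and products and the million-URL crawl bloat described above follows quickly.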
What can be done to prevent “crawl bloat”?
Paying attention to crawl budget becomes increasingly important. Blocking pages that should exist for users but should not be crawled by search robots can result in large, immediate lifts in organic traffic.
To ensure your crawls are as efficient as possible, you can use things like noindex tags, or block certain pages from being crawled completely by creating a robots.txt file. Robots.txt is a file that Googlebot, or any bot for that matter, hits before it crawls a site; it contains directives telling the robots how you want them to crawl the site. This really trims down the way a site is being crawled and can make a huge difference, especially on larger sites with more than 10,000 pages.
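As a concrete sketch, here is a tiny robots.txt that blocks an internal search path from crawling, checked with Python’s standard-library parser. The `/search` path is an illustrative assumption, not a universal rule; note also that noindex is a meta tag served on the page itself, whereas robots.txt controls crawling.

```python
# Verify robots.txt directives with the stdlib parser: category pages
# stay crawlable while internal search-result URLs are disallowed.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/mens-shoes"))    # True
print(parser.can_fetch("Googlebot", "https://example.com/search?q=red"))  # False
```

One caveat: the stdlib parser matches Disallow rules as simple path prefixes, while Google additionally supports `*` and `$` wildcards, so patterns for filtered URLs (e.g., query parameters) should be tested against Google’s own documentation.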
Gail is a Technical Editor and Interviewer for Website Planet. Her first PC was a TRS-80 which required a cassette tape to boot up. Producing websites, emails, and banners combines her love for design, technology, and of course, writing.