Friday 30 September 2016

SearchCap: Penguin & link building, PPC leads & social

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Penguin & link building, PPC leads & social appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

What’s new and cool at Google from SMX East 2016

At this year's SMX East, Googlers Jerry Dischler and Babak Pahlavan shared recent updates and what's coming to AdWords and Google Analytics. Columnist Mark Traphagen was on hand to cover the highlights. The post What’s new and cool at Google from SMX East 2016 appeared first on Search Engine...

Please visit Search Engine Land for the full article.

Up close at SMX: Using paid search and social together

From left to right: Pamela Parker, Executive Features Editor, Marketing Land & Search Engine Land; Tara Siegel, Senior Director of Social at Pepperjam; Maggie Malek, the head of social at the MMI Agency; and Sahil Jain, CEO of AdStage.


No news flash here. Marketing teams cannot afford to exist in silos.

Paid search and social are no exception. You can amplify the reach of both of these channels by combining your efforts and leveraging data from each together.

In the SMX East session, “Using Paid Search and Social Together,” three speakers, Tara Siegel, Sahil Jain and Maggie Malek, shared their top tips for winning with paid search and social.

Using search techniques to win at paid social by Tara Siegel

Tara Siegel, the Senior Director of Social at Pepperjam, said she is on a mission to make people understand that social must be viewed holistically. Social is an omnichannel optimizer.

It’s very important to be consistent across channels. Deliver the right message to the right people, Siegel explained.


In social, campaign goals will guide ad type and strategy. The same is true for search, with ad extensions, click-to-call, lead generation and so on.

[Read the full article on Marketing Land]


Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



Why call tracking helps improve PPC lead generation account performance


Many businesses receive significant lead volume from phone calls, and marketers want to generate phone leads because of their immediacy: a caller can be moved into the sales funnel right away.

This article delves into why lead generation businesses need to have a call tracking solution in place, how data collected through call tracking technology can improve conversion funnel performance and why integrating call tracking into third-party systems can lift paid search performance.

Why do lead generation businesses need call tracking?

Lead generators run into a blind spot when trying to assess the value of their paid search campaigns. While it’s easy to track web-based conversions, phone leads generated through a single “catch-all” phone number can’t be tied back to a specific source or keyword.

Optimizing accounts with incomplete information leads to poor outcomes such as pausing campaigns, reducing keyword bids or removing marketing sources that could be providing value through phone conversions.

Call tracking provides visibility into total account performance via use of tracking phone numbers. These tracking numbers identify a phone lead’s marketing source (e.g., Google or Bing) and the keyword that specifically drove that phone conversion.
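
To make the mechanics concrete, here is a minimal sketch of how a pool of tracking numbers ties an inbound call back to its source and keyword. Every number and field name below is a hypothetical placeholder; a real call tracking platform maintains this mapping for you.

```python
# Minimal sketch: joining inbound calls back to the paid-search source and
# keyword that was shown a given tracking number. All values are illustrative.

# Tracking numbers assigned per marketing source/keyword combination.
TRACKING_NUMBERS = {
    "+1-555-0101": {"source": "Google", "keyword": "emergency plumber"},
    "+1-555-0102": {"source": "Bing", "keyword": "24 hour plumber"},
    "+1-555-0103": {"source": "Google", "keyword": "water heater repair"},
}

# Calls as they might arrive from a call log export (assumed structure).
calls = [
    {"caller": "919-555-1234", "dialed": "+1-555-0101", "duration_sec": 312},
    {"caller": "984-555-8765", "dialed": "+1-555-0103", "duration_sec": 45},
]

for call in calls:
    attribution = TRACKING_NUMBERS.get(call["dialed"])
    if attribution:
        print(f"Call from {call['caller']} attributed to "
              f"{attribution['source']} / '{attribution['keyword']}' "
              f"({call['duration_sec']}s)")
    else:
        print(f"Call from {call['caller']} came through an untracked number")
```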

Having this additional information on hand better informs key decisions such as whether account structure needs to be altered or budget allocations shifted between campaigns and sources. For instance, analysis of data from a call tracking solution can lead to expanding a PPC account into new campaign types (such as call-only campaigns) or optimizing an account’s ad messaging to include “call us today” or similar call-to-action messaging.

Improving the conversion funnel

Generating leads is only half the battle for lead generation marketers. The leads generated need to convert into paying customers to justify the outlay of marketing dollars. A call tracking solution can also bring specific information to paid search marketers about the sales funnel that can be optimized. Here are a few solutions to consider.

  • Automatic phone routing. Provides the ability to set specific rules and criteria to take inbound calls and automatically route them to a salesperson in real time. Immediately routing phone leads to a salesperson or call center reduces lead aging and increases the probability of converting that lead.
  • Phone call classification. Variations of this feature can be used to automatically classify phone leads as good leads or bad leads. Furthermore, automatic classification of phone calls can help determine whether paid search traffic is truly driving sales-related calls or support calls. Leveraging this information can help optimize PPC campaigns to ensure high-quality, sales-oriented leads are being generated and that every marketing dollar is optimized for maximum return. (A simple classification sketch follows this list.)
  • Call transcripts. Analyzing conversations between customer and sales representative is one of the best ways to both optimize the back-end conversion funnel and to uncover new keyword lists and audiences that target qualified, top-of-funnel prospects. One of the most effective PPC (and overall marketing) strategies is to optimize and target based on what your current customers are telling you.
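
Below is a minimal sketch of the rule-based classification mentioned above, assuming you have call transcripts or notes available as plain text. The keyword lists are illustrative only; commercial call tracking platforms use far more sophisticated models.

```python
# Minimal sketch of rule-based phone call classification from transcript text.
# The signal word lists are assumptions for illustration.

SALES_SIGNALS = ("quote", "pricing", "how much", "schedule an appointment", "buy")
SUPPORT_SIGNALS = ("cancel", "refund", "not working", "warranty", "complaint")

def classify_call(transcript: str) -> str:
    """Label a call as 'sales', 'support' or 'unknown' from its transcript."""
    text = transcript.lower()
    sales_hits = sum(term in text for term in SALES_SIGNALS)
    support_hits = sum(term in text for term in SUPPORT_SIGNALS)
    if sales_hits > support_hits:
        return "sales"
    if support_hits > sales_hits:
        return "support"
    return "unknown"

print(classify_call("Hi, I'd like a quote on a new water heater install."))  # sales
print(classify_call("My unit stopped working and I want a refund."))         # support
```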

Strategically speaking, call tracking solutions provide the means to create a “closed-loop” PPC marketing strategy. Simply put, top-of-funnel data can be used to optimize the back of funnel, and back-end funnel insights can improve how the top of funnel is targeted.

Integrating into third-party systems

Most call tracking solutions offer the ability to integrate into a variety of CRM, advertising and other platforms. The ability to integrate call tracking provides more complete insights and enhances the ability to optimize your PPC program. Some key integrations revolve around:

  • CRM. Integrating call tracking into a CRM system allows for the ability to create records from phone leads that can be managed and tracked through the sales funnel.
  • Advertising platforms. Integrating call tracking into platforms like AdWords or Bing Ads further guides marketers regarding how to best create and optimize paid search campaigns.
  • Bid management. Integrating into third-party bid management platforms increases the effectiveness of their technologies. For instance, feeding call conversions into their systems allows for creation of call-specific bid rules and also provides the additional data needed to make specific bid algorithms like CPA or position-based bidding work more efficiently.
  • Conversion rate optimization. Call tracking integrated with CRO technology provides deeper insights into testing experiments and can also help determine new testing ideas. Call conversion tracking embedded within CRO tests more accurately determines the success of a particular landing page or set of pages.

Final thoughts

Call tracking provides marketers the information and functionality needed to optimize both the top and bottom ends of the conversion funnel. Gaining visibility over phone lead performance and fine-tuning lead generation efforts will lead to better paid search and overall business results.


Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



Authority & link building with real-time Penguin

Link building and giving Google what they want

So it happened. Google finally released Penguin 4.0 — the last Penguin update of its kind, as it now processes in real time as a part of Google’s core ranking algorithm.

In this post, I want to take a look at what Penguin is, how this update affects the SEO community as a whole and how the brave and the bold can continue to safely improve their organic visibility without fear of repercussions from punitive search engine algorithms.

The announcement

After a few weeks of turbulence in the SERPs, the announcement that many had predicted was finally made.

The Penguin 4.0 announcement had two key points:

  1. Penguin is now running in real time. This is really good news. There are lots of folks out there who have paid the price for low-quality SEO yet are still not seeing a recovery after removing or disavowing all of their spammy backlinks. Certainly, a house built on dodgy links will not spring back to a position of strength simply by removing those links; however, there are many businesses out there that seem to have been carrying algorithmic boulders around their digital ankles. Hopefully, these folks can now move on, their debt to a punitive algorithm update paid in full.
  2. Penguin is now more granular. This element is a little more curious, in that Penguin 2.0 seemed to add page-level and keyword-level penalties, making it more granular than the 1.0 release. However, we can only imagine that things have got much more advanced, and possibly individual links are considered rather than the seemingly aggregate approach that was taken historically. Only time will tell the degree to which this granular approach will impact sites, but I suspect it will be a good thing for those looking to play by the rules.

It will also be interesting to see how this fits in with the other 200 or so factors or “clues” that Google uses to rank websites. We now have both Panda and Penguin integrated into Google’s core ranking algorithm (though Panda does not run in real time), so it’s possible that the weight of the various known ranking factors may have changed as a result.

One other interesting nugget is that there will be no more notifications for Penguin updates. Penguin now constantly updates as Google crawls the web, so tweaks to the finer points of this system will no longer be announced. Personally, I think this is a good thing — folks can concentrate on doing good marketing (and SEO) rather than nervously waiting for the hammer to fall on some overused link-building tactic.

Links are still important

It’s important to remember that links are still important. Google has clarified this a number of times now, with Googlers such as John Mueller, Gary Illyes and even Matt Cutts clarifying the importance of links as a ranking signal, while also often warning of the problems of focusing on just links as a marketing and SEO strategy.

Of course, if we can step back a little, this makes perfect sense. If you have a simple five-page website, no corresponding social or PR noise, and 5,000 links… something does not quite add up there. Why would people cite that resource so widely?

On the other hand, if you have worked hard on your site and have a hundred or so great content pieces, solid reputation signals and 500 or so editorial links with wildly varied anchor text spread across the web with no discernible pattern, then this does look a little more natural. Add on some PR and social activity, and we start to see a pattern that looks like a real business.

So links are important — critically so for SEO. But links are one of many factors and should not be looked at in isolation. In fact, great links should often be the side effect of great marketing. So take off your reverse-engineering hat and put on your smart marketing hat, and you are moving in the right direction.

What Google wants

I always find it useful to briefly analyze Google’s recommendations. You can be sure those press releases and webmaster guideline pages are carefully worded, and often we can derive more clarification of what is needed.

  • From the Penguin 4.0 announcement: “focus on creating amazing, compelling websites.”
  • From the Link Schemes page: “The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.”
  • From the Webmaster Guidelines: “Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.”
  • From the original Penguin announcement: “focus on creating amazing, compelling web sites.” This is the same statement again, so they must mean it, folks!

There is a common thread here: quality. Whether it’s website quality, link quality or content quality, Google clearly wants to drive this point home.

“The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community.”

That statement says it all. The only problem here is that Google is often a little (wildly?) optimistic. Creating great content often is not enough on its own. You have to let people know about it.

You have to build relationships with folks who may be interested in what you have to say. You want to build relationships with other bloggers and website owners. You will then want to look at ways to use these relationships to start building the kind of links that will move the needle.

High quality, in Google’s eyes, means making your site valuable to your target audience. Create something that is really, truly helpful — then let people know about it. Don’t do this the other way around and start building links in volume where there is nothing of value to link to.

This, in a nutshell, is the problem with most link-building efforts — they are tackled completely back to front. Links are built before on-site value has been created. The solution to this is simple: Start with your site. Build something of value. Then layer your link building over the top.

Link building tactics

The following is a brief overview of some link building tactics that still have merit and are based on the thinking above.

  1. Basic prospecting. Using a range of advanced query operators, you can often find resource pages or even (shock, horror!) highly ranking and well-maintained directories that are relevant to the product or service you provide. The more content you have on your site, the easier it becomes to find sites that are linking to similar resources and that will consider linking to you. Search for your keywords +resources, +links, +directory and other terms that indicate a relevant resource. Then do the requisite research. (More details.)
  2. Competitor research. Often, reviewing the links your competitors have will reveal some sites that will consider linking to you or your content. Again, make sure you have something of value before requesting any links — and remember that just because a site links to your competitor, it does not mean that link is helping them rank. Think quality. See some smart thoughts on how to do (and not do) competitor research the right way here.
  3. Guest post prospecting. Guest posts are still a great way to generate exposure for your business and tap into a site’s audience. Remember, though, quality must come first. Likewise, if you have an opportunity to link to a piece of content within the body of the article and it adds value, then do it in a natural way to get an editorial vote within that article. I would tend to look for blogs in your space initially and manually review whether they have guest authors. You can also prospect, again on Google, using search strings like “Keyword” + “guest post,” “keyword” + “write for us,” “keyword” + “contributor” — I favor this approach rather than tools, as the sites that rank well for these searches are likely to be authoritative.
  4. Content + outreach. Once you have a bedrock of great content on your site, you can find sites that link to other articles and then go about contacting the owners to see if they will link to your content. Ideally, your content should improve on what they already link to so that link can be swapped out or yours can be included in addition to the original link. The Skyscraper Technique can work well here; however, it is not perfect for every situation.
  5. Broken link building. This is similar to #4 in many ways, but you are looking for broken links on sites you have identified as potentially providing a valuable link (or vote) for your business. You may well need to create some content to fill the gap when you find an opportunity, but this approach, where you are helping the site owner and providing a simple alternative, can yield great results. Identify sites you would like a link from, then crawl those sites with Screaming Frog or Xenu Link Sleuth to find opportunities (a minimal checking sketch follows this list). Lots of legwork here, but you can find diamonds in those 404s!
  6. Local organizations. With local businesses, I like links that help tie you into that physical location. Often you will be able to find clubs or some such that will accept some sponsorship in return for a safe, branded link on a page of their site. Play it safe here; do this for the right reasons, and you can generate some solid ties with local businesses, leading to more eyes on your business and some great local links.
  7. Press and PR. Further to having an amazing site, doing great things in the real world can also have benefits in getting exposure in the press and relevant publications. This will, in many cases, generate a link back to your site, again helping you build trust and relevance.
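
As referenced in tactic #5, here is a minimal broken-link-checking sketch, assuming you already know the prospect page’s URL. It uses the requests and BeautifulSoup libraries; a dedicated crawler such as Screaming Frog will do the same job at scale.

```python
# Minimal sketch of a broken-link check on a single prospect page.
# Requires: pip install requests beautifulsoup4. The URL is a placeholder.

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def find_broken_links(page_url: str) -> list:
    """Return outbound links on page_url that respond with a 4xx/5xx status (or no response)."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    broken = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, fragments, etc.
        try:
            # Some servers reject HEAD requests; a GET fallback may be needed.
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

for link, status in find_broken_links("https://example.com/resources/"):
    print(f"{status}: {link}")
```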

This is not meant to be an exhaustive list, and really, I don’t like to attempt link building from a tactics perspective without having a clear and unique strategy tailored to each business. We see tactics that deliver great results for one business completely fail to deliver for another.

This game is all about determining what is right for you and adding links to your site that enrich the web and make the linking page a better place. Of course, to do this, you have to focus on ensuring your site is the very best it can possibly be so the linking site is improved by the link to yours.

Final thoughts

The best SEO often comes down to common sense. Spammy directory listings did not make sense. They were there purely for SEO. This backwards approach meant many sites were top-heavy with links but thin on content. All that time and effort spent, and no real value added to the site.

I talked about the psychology and history of this in a post on my own blog called “Ass Backwards Link Building” that really dives into how search engines work, the mentality of many low-end SEO agencies and how their practices are directly out of alignment with Google’s own “give, give, give” mentality.

Sure, Google shows us ads. Lots and lots of ads. But they gave us free access to the world’s information. In my mind, that is a good trade-off.

Unfortunately, though, we live in a world where we have folks looking for a cheap SEO solution, and there will always be some provider who will fill that gap — the demand for cheap SEO creates cheap SEO. Around and around we go… unless, of course, Penguin 4.0 finally puts paid to risky, low-value tactics.

I sincerely hope that Penguin 4.0 puts an end to the often daft link-building tactics of the past. Penguin may well need some fine-tuning, but link building in 2016 and beyond will mean tackling your website first — building something great, and then letting people know about it.


Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



SEJ LIVE: Anne Ahola Ward & Bridget Randolph on the Future of SEO, Mobile Search by @wonderwall7

Search in Pics: Google & YouTube cake, pumpkins & DJs

In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more.

GoogleBot at the AngularConnect conference:

Source: Twitter

DJs in suits at a Google partners party:

Source: Twitter

Google’s 18th birthday cake:

Source: Instagram

Google pumpkin:

Source: Instagram

YouTube cake:

Source: Instagram



Aim for the Stars: Creating Truly Killer Skyscraper Content by @IAmAaronAgius

If you feel like it has become increasingly difficult to capture your audience’s attention and traffic is lagging despite your rigorous content schedule, then you’re not alone. The way audiences consume content and what they want from it continues to change. Audiences are demanding more value, and search engines like Google have no intention of giving your content top billings in the SERPs until that value is proven.

Don’t forget about the growing competition from other brands that are cranking out more content and flooding the already crowded pools of web content in nearly every industry imaginable.

Although 60% of marketers struggle to create engaging content, the majority (over 75%) of marketers still plan to produce more content this year. That trend isn’t likely to change, either.

But if your existing content isn’t generating traffic, and repurposing your best stuff isn’t doing much to move the needle, what other option do you have when it feels like your content campaigns are failing?

Before you throw your budget into paid promotions and sponsored posts to gain visibility, you should try the Skyscraper Technique.

What is Skyscraper Content?

The Skyscraper Technique is an approach to creating content for the purpose of producing a large volume of quality backlinks, all organically developed and completely on the level. This system was devised by Brian Dean over at Backlinko and he has shared a few case studies that show just how effective this method is:


While the primary intent is to create backlinks, it’s also a foolproof method for generating a lot of organic traffic within a short span of time.

The major benefit of creating a surge of relevant traffic is that a percentage of those visitors are going to stick around. Also, the content is so well-shared that you’ll generate a ton of new visitors who will continue to return to your site.

So, what makes Skyscraper content much more effective than what you’re already producing?

In this case, you’re sourcing topics that are already pretty popular, including already-published content with a large volume of shares. You take that content and inject as much value as you can to fill in the gaps, creating a more comprehensive and engaging piece.

That high-value content is then promoted out to a very specific audience to maximize its reach within a short time frame. The result is an explosion in referral traffic and massive gains in quality backlinks.


Why Skyscraper Content is So Effective

Think about the last time you had an intense craving for a specific type of food. When you finally got your hands on it, the feeling of satisfaction and relief was overwhelming. You fulfilled an intense demand for something, which impacted you on a psychological – and probably emotional – level.


That’s one of the reasons this technique is so effective. There is already a demand for the type of content or topic. Tens of thousands of people online already indicate interest in the topic by the volume of shares. It’s clearly something they want, and if they already love it that much, imagine how satisfying it would be to get even more valuable content on the same topic.

This is the other reason why the Skyscraper Technique is so effective for racking up traffic. Rather than guessing what might work with your audience, you already know what they want. They have been primed and in turn, they’ve shown their cards.

You know exactly what is going to excite them, so it should be easy to create something that will blow them away.

Lastly, because of how well the original content was shared, you know that by creating something better, you will be able to target key publications among the distribution channels and audience to get massive exposure.

The more shares you get, the greater that spike in organic traffic and backlinks will be.

Creating Awesome Skyscraper Content

Producing top-notch content that delivers the kind of results seen in Backlinko’s case studies might seem like a deeply-involved process. However, it doesn’t take much more time than you might otherwise spend creating content from scratch.

In one case study, Brian revealed that it only took about six hours to create a single piece of content that produced over 17,000 visitors in one day.


There’s a simple formula to follow to make this technique work for you:

1. Source the Best Content Opportunities

There are several ways to find hot topics that your audience is searching for, and some of these ways require a little more legwork than others.

  • Watch the communities: Tune into larger, more popular groups and forums where your customers spend their time, including relevant subreddits on Reddit.com. Find the most upvoted content or the top discussions. Watch for articles that are heavily shared and transform the hottest discussions into content ideas for your site.
  • Check aggregate sites: Sites like Alltop not only list what’s new from major publications, but they also show you the most popular stories that are trending or featured within any given industry.
  • Watch competitors: Keep a close eye on what your competitors are doing, especially the well-entrenched incumbents. Doing this will make it easier to pinpoint when some of their content gets picked up and carried by followers.
  • Use BuzzSumo: This is the easiest method to source the top content of all time for any given topic. Plug in your search term and BuzzSumo will return a list of articles ranked by the volume of shares.


2. Create a Distribution Plan

Every content strategy needs a distribution plan. Even the best content won’t perform if it’s not promoted effectively. The first step is to build your foundation on who shared the original content.

In BuzzSumo, you can view the backlinks and sharers then make a list of the most prominent and influential people and publications. That’s who you’ll target with your promotions.

Also, consider what other channels you can use to promote your content. These could vary based on the format of your content, but be sure to check out LinkedIn Pulse, Medium, SlideShare, Reddit, StumbleUpon, and targeted communities where your audience spends their time.


3. Build a Better Mousetrap

The approach is quite simple here: take the original content and find ways to make it better. You don’t want to copy it, but you certainly want to take a topic that people loved and reproduce it in a way that is 10x better than the original:

  • Give it a new angle
  • Update statistics and data
  • Flesh it out to add more value
  • Improve the visuals
  • Fill in any gaps to make it a more comprehensive piece

Alternatively, you could produce something better in a different format than the original. For example, take a short podcast or infographic and create an amazing, in-depth blog post on a similar topic.

4. Publish and Start Promoting

Once your content is live, it’s time to gain momentum. Reach out to those who shared the original on social media and tip them off to your new-and-improved content.

For publications that linked to the original, connect with their editors and show genuine interest in providing value to their audiences. Share your content with them and be clear about what you’re asking. Don’t be afraid to ask them to share it.

Conclusion

The keystone of the Skyscraper Technique is value, so make sure you’ve created a truly comprehensive piece with takeaways that align with the audience of the original. Through targeted outreach and expanded distribution to new channels, you should see a sharp increase in traffic and backlinks for your Skyscraper content.

Then continue to replicate the process for other great content you already created to expand your audience and visibility.

Have you used the Skyscraper Technique for building traffic and links? Share your results with me in the comments below.

Image Credits

Featured Image: cegoh/DepositPhotos.com
All screenshots by Aaron Agius. Taken September 2016.


Tom Anthony Talks About the Future of Search on #MarketingNerds

Visit our Marketing Nerds archive to listen to other Marketing Nerds podcasts!

In this Marketing Nerds episode, SEJ Chief Social Media Strategist, Brent Csutoras, was joined by Tom Anthony, Head of Research & Development at Distilled, to talk about the future of search, other technology trends, and how to put it all together to understand the main trajectories in the industry.

#MarketingNerds: Tom Anthony on the Future of Search | SEJ

Here are a few excerpts from Brent and Tom’s conversation, but be sure to listen to the podcast to hear everything.

Why Thinking About the Future is Good for SEO

Thinking about the future, I just think is so important for doing good SEO. We don’t want to be doing SEO in a way that next year, we’re undoing the things that we did this year.

So, I pushed for Will and Duncan, the founders of Distilled, that we should have a department that just focuses on trying to understand the future—trying to know where is Google taking us and how we can use our understanding of that right now.

So, a large part of my role is looking at the publications that Google puts out, looking at anything they’re blogging about and the other technology trends, trying to put it all together, and thread everything together to understand the main trajectories.

It’s impossible to understand exactly where we will go, but if you understand the main direction, [it’s] surprising how often that’s helpful in setting strategies for our clients.

How Voice Search Will Affect the Future of Search

Voice is something we’ve been talking about a lot, and it’s similar to how we talked about the year of the mobile. At some point in that conversation, we surpassed the moment that we were in the year of the mobile, so it was the era of the mobile, but there was never a watershed moment.

We’ve been talking about voice search for a couple of years now. There’s not going to be a watershed moment. It’s just going to slowly increase in importance until, one day, somebody will realize how important it is.

I think in May this year, Bing said that 20% of mobile queries are now voice searches, and if you look at the fact that mobile traffic almost doubled in 2015, then you start to realize, “Okay, this is starting to become something that’s appreciably huge.”

Twenty percent of the mobile search pie is voice, which means it is something we should be paying attention to.

Issues in Adopting Voice Search Technology

I think there is a concern and there isn’t a concern. You’ve got to remember that you and I are in the industry. We’re playing with these technologies earlier than most people start to use them.

By the time they start to get more mainstream, the accuracy will be much better: in the last two years, the error rate for Google voice queries dropped from 25% to 8%. Apple has talked about similar numbers with the machine learning that they’ve been doing to understand Siri queries.

Google put a related function on Android where you can unlock the phone with just your voice, so it recognizes your voice. I think we might start to see devices doing that sort of learning. Apple is doing a lot of machine learning where the learning happens on the device, from a privacy standpoint.

How Can Marketers Pay Attention to Voice Search?

The exciting thing about voice search, from my perspective, isn’t the fact that it’s using your voice. It’s the fact that when you speak a query you naturally use natural language.

This lends itself to doing that sort of conversational query type thing, where I revise a query. So I ask a question, and then I say, “Actually, Siri, I meant this, so find me a list of recipe books. Actually, just show me the vegetarian ones.” You start getting what I call compound queries, where I do a follow-up query to either revise my initial query or ask something else.

Once you have this natural language interface, you start to remove some of the barriers of having a visual interface. As soon as you have a natural language interface, and there are no visual indicators, you remove all of those barriers, and so people start doing far more complex queries than ever before.

I think that’s going to have a massive knock-on effect regarding the types of queries we see people doing, which is going to lead to us having to do SEO in a different way.

An example is something I call faceted search. At the moment, a typical e-commerce search query might be I search for fridge/freezers in Google. I get a list of ten blue links. I click on one. I go off to a website. I see a whole bunch of fridges. There, I might filter and sort those fridges, and then none of those are right. I bounce back to the search engine. I click another blue link, and I go off into another round of filtering and sorting.

What we’re going to see happen is people expect to be able to do all of that inside the search engine, so I’m going to say, “Show me a list of fridge/freezers that have a capacity of 40 liters for the freezer section or more,” and Google is going to have to be able to start answering those queries.

Making Computers Do the Work for You

That’s the terrifying, exciting future; but we’re not there yet.

It’s another one of those things. It will gradually creep up on us, and I think we’re already going to start to see the computers doing stuff for us.

We’re already starting to see that. If you look at Google Inbox, 10% of the messages people send from the Inbox app have been written by a machine-learning feature called Smart Reply: you send me an email, and it just suggests what I should reply. And 10% of the replies people are sending are being written by the algorithm, so we’re already starting to see computers doing stuff for us.

At some point down the line, it’s not even going to wait for me to say, “Yes, send that email.” It’s just going to send it for me.

What I think we’ll see in the next two or three years is something that I call computational queries.

We’re going to see the computer starting to help you process a query. For example, I might say, “Find vegetarian restaurants near the city center which are well-rated and let me know which are closest.”

That search query is quite complex because it needs to know my context; it needs to know which city center I’m talking about. It needs to go away and look up those vegetarian restaurants; then it needs to look up how well rated they are. It needs to understand what I mean by well-rated, because what I mean by well-rated might be different to what you mean by well-rated, so it’s got a personalization aspect.

Then I want to know which ones are closest, so it needs to go and look up that information based on the results of the first part of the query. But it needs to do all of that in one go, in one step. Those are computational queries.

I think it’s very realistic that we will see those in the next two or three years, and those could have a dramatic impact on SEO.

Conversational Search Queries

There’s an interview that was published last year with a Google engineer. I can’t remember who it was, but he talked about exactly that scenario where you ask Google a query.

It’s probably going to be a voice query because that lends itself to this conversational approach, and it says, “Did you mean this or that?” It was trying to make sure it properly understood your intent because that’s something that has been a key goal of Google with search queries.

To listen to the full Marketing Nerds podcast with Brent Csutoras and Tom Anthony, visit the Marketing Nerds archive.

Think you have what it takes to be a Marketing Nerd? If so, message Kelsey Jones on Twitter, or email her at kelsey [at] searchenginejournal.com.

Visit our Marketing Nerds archive to listen to other Marketing Nerds podcasts!

 

Image Credits

Featured Image: Image by Paulo Bobita
In-post Photo: LDProd/DepositPhotos.com


Thursday 29 September 2016

SearchCap: Google Penguin recoveries, voice enabled maps & Landy Awards

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Google Penguin recoveries, voice enabled maps & Landy Awards appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

iProspect, Razorfish, Wolfgang Digital earn multiple wins at 2016 Landy Awards


Three agencies picked up multiple awards Wednesday night as Search Engine Land hosted its second annual Landy Awards in front of a full crowd of more than 300 search marketers at Edison Ballroom in New York City.

iProspect won the coveted SEO Agency Of The Year award and also took home a trophy in the Best B2B SEM Initiative – Enterprise category. It’s the second straight year that iProspect has won multiple Landys.

But they weren’t the only agency to walk away with multiple honors.

Razorfish picked up wins in both the Best Mobile SEM Initiative and Best Retail SEM Initiative categories. Those are Razorfish’s first Landy Awards. And Wolfgang Digital, a Dublin, Ireland-based digital agency, picked up its first Landys in the Best SEM Initiative – Small Business and Best Cross-Channel Integration of Search categories.

The highly-coveted final award of the night, Best Overall Search Marketing Initiative, went to Edelman. Jennifer Johnstone of Piston Agency was named female Search Marketer Of The Year, and Eric Enge of Stone Temple Consulting won male Search Marketer Of The Year.

Without further ado, here’s the full list of 2016 Landy Award winners:

Best Local SEM Initiative
C-4 Analytics for a luxury car dealership

Best Local SEO Initiative
DAGMAR Marketing for Turner Pest Control

Best Mobile SEM Initiative
Razorfish for Lane Bryant

Best Retail SEM Initiative
Razorfish for Lane Bryant

Best Retail SEO Initiative
Go Fish Digital for a jewelry retailer

Best SEM Initiative – Small Business
Wolfgang Digital for McElhinneys

Best SEO Initiative – Small Business
Noble Studios for Tahoe South

Best Overall SEO Initiative – Enterprise
iCrossing for Microsoft

Best B2B SEM Initiative – Enterprise
iProspect for Microsoft Office 365

Best B2C SEM Initiative – Enterprise
Point It for Microsoft Store

Best Overall SEM Initiative – Enterprise
Acronym for Scotts Miracle-Gro

Best Cross-Channel Integration Of Search
Wolfgang Digital for Littlewoods Ireland

Search Marketer Of The Year
Jennifer Johnstone

Search Marketer Of The Year
Eric Enge

In-House Team Of The Year – SEO
Lenovo

In-House Team Of The Year – SEM
UPMC Health Plan

Agency Of The Year – SEM
Jellyfish

Agency Of The Year – SEO
iProspect

Best Overall Search Marketing Initiative
Edelman for Zagg

This year’s awards featured 19 categories, up from a dozen in 2015, and nearly 200 separate entries, which was almost double last year’s count. The entries were judged by an esteemed panel of search industry experts, including representatives from Google and Bing, as well as Search Engine Land’s editorial staff and contributors.

The event raised $5000 for the SoulMates program at Dana-Farber Cancer Institute, a choice that was inspired by the experiences of our very own Monica Wright, who was diagnosed with breast cancer last November.

Stay tuned to Search Engine Land in the coming weeks as we profile some of the 2016 Landy Award winners and their successful search marketing campaigns.



Google Maps Maximizes Voice Search Capabilities in Latest Update by @SouthernSEJ


Google Maps has updated its mobile app with new voice search capabilities which can help people do more while keeping their eyes on the road.

Now, after entering driving mode, the “OK Google” voice command can be used in various ways. You can tell when a voice command is available by looking for the white microphone icon in the top right.

Uttering the phrase “OK Google” will trigger the app to start listening for your next command. Alternatively, you can just tap on the microphone icon and proceed with your voice command. From there, you can tell the app to perform tasks or find information for you, such as “find the next rest stop” or “what’s my ETA?”

In order for this to work, it has to be set up properly. To do that, open the Google Maps app and tap on the three-dot menu button on the bottom right. Then go to Settings > “OK Google detection” > turn on “While driving”.

In addition to the usual voice commands, Google has added some new ones, including:

  • “Show / Hide traffic”
  • “Mute / Unmute voice guidance”
  • “Avoid tolls / highways / ferries”
  • “How’s traffic ahead?”
  • “Show alternate routes”

Here is a full list of the new voice commands you can use with Google Maps. To start using them, update the Google Maps app on either iOS or Android, and then hit the road. Safe travels!


Prepping SEO for 2017: it’s all about the ROI


Fall is in the air, and that can only mean one thing for most digital marketers: budget season.

The approaching fourth quarter is often the time when companies begin the budget and planning process for the next fiscal year. And it seems that ROI, while always considered a top priority, has renewed importance now. Advertising Age recently reported that intense demand for ROI is causing companies to replace their CMOs at a rapid rate — as much as a 48-percent turnover in top retailers.

You’d think that ROI would be easy to track on digital, right? Compared to offline media, digital clearly has a tracking advantage. But integrating tracking correctly can be difficult, especially for what may be influencing channels and not the final purchase channel, which can be the case at times for organic search and SEO.

So what’s the answer? How do you integrate SEO into the tracking mix and prove the organic search channel’s ROI? How you’ll track ROI may differ based on the tools and data you have access to in your organization.

Determine your attribution model

If your organization hasn’t yet determined the attribution model to use, that’s where you’ll need to start. The attribution model is the basis for allocating credit to each marketing channel. There is no correct or incorrect attribution model or one that applies to all organizations. Each model is different, and you’ll need to decide which model best fits your business.

The most common attribution models are single-source attribution, measuring first touch or last touch. First-touch, as the name implies, gives all credit to the first channel or lead source that brought the customer or lead to your website. The first-touch channel is recorded and then never changed. By contrast, the last-touch attribution model credits the last channel the customer or lead used to come to your website. The last-touch channel is always updating as the customer or lead continues to interact with your site over time.

In part, these attribution models are most common because many measurement tools, like marketing automation or CRM (customer relationship management), often only have one field to store attribution data. Unfortunately, first- or last-touch attribution essentially ignores all of the channels that may have influenced a customer or lead in the process.

If you want to use a model that gives some level of credit to all channels that may have influenced along the way, consider a fractional attribution model, such as linear or time decay. Linear attribution credits all influencing channels equally, whereas time decay gives the most credit to the most recent channel and the least credit to the oldest channel for that customer or lead.

To track each lead’s or customer’s fractional attribution, however, you will likely need a new field created in marketing automation and/or CRM. I’ve often created what I call a “running lead source” field, which appends the last lead source to the end of the field every time a new channel is encountered by this lead:

[Screenshot: the running lead source field, appending each new channel as it is encountered]

I can then download my leads into an Excel spreadsheet and break apart and examine the array of data in this field. I also find this approach useful for reviewing which content pieces had an impact on the buying cycle for leads.
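
For those who prefer to script it, here is a minimal sketch of breaking that field apart from a CSV export. The delimiter, column names and file name are assumptions; adjust them to however your marketing automation platform stores the field.

```python
# Minimal sketch: parse a "running lead source" field from an exported CSV and
# derive first- and last-touch channels per lead. Field names are assumptions.

import csv

def parse_touches(running_lead_source: str, delimiter: str = ";") -> list:
    """Split the appended channel history into an ordered list of touches."""
    return [t.strip() for t in running_lead_source.split(delimiter) if t.strip()]

with open("leads_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        touches = parse_touches(row.get("running_lead_source", ""))
        if not touches:
            continue
        print(row.get("email", "unknown lead"),
              "| first touch:", touches[0],
              "| last touch:", touches[-1],
              "| all touches:", touches)
```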

Set up Google Search Console

Ideally, your website(s) already have Google Search Console (GSC) set up. And I expect that for many, it goes without saying that GSC is an essential SEO tool, helping you to understand measurements that you can’t typically ascertain on your own.

For instance, when we advertise using Google AdWords or other paid search platforms, the platforms provide us with impression data and click-through rate (CTR). This helps us to understand how many people, when presented with our message, seemed interested enough to click through.

With organic search, however, it’s a bit more difficult. You don’t know how many people searched for your keywords and the CTR — unless you use GSC.

However, there are a few pitfalls to avoid with GSC:

  • GSC only keeps the last 90 days of search history. Be sure to download CSVs of data from GSC before that data is erased.
  • HTTP and HTTPS must be tracked under separate GSC website properties. Ideally, you should move to HTTPS for SEO reasons too, but if you still use both HTTP and HTTPS, you’ll need to have a GSC website property for each and then combine impressions and clicks data from each. However, do not combine or average percentage data, like click-through rate. You’d basically be taking an average of an average, which is inaccurate. You’ll need to calculate this on your own after combining the raw impressions and clicks first. (A short sketch of doing exactly that follows below.)
  • Mobile sites and desktop sites may be tracked separately. If you have a separate mobile website (not necessarily a responsive site) such as m.domain.com, then you’ll need a separate website property in GSC to track it.
  • Subdomains must be tracked on their own GSC website property. I know, not ideal. But Google’s documentation is clear that GSC treats subdomains as separate sites.

If you have to create multiple website properties in GSC, you can now at least tie them together using Property Sets, which allow you to see the data in a combined report.
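
As referenced above, here is a minimal sketch of combining clicks and impressions from two property exports (HTTP and HTTPS) and recalculating CTR from the raw sums rather than averaging percentages. The file and column names are assumptions; match them to your actual GSC CSV exports.

```python
# Minimal sketch: combine GSC query exports for the HTTP and HTTPS properties
# of the same site, then recompute CTR from summed raw clicks and impressions.

import csv
from collections import defaultdict

totals = defaultdict(lambda: {"clicks": 0, "impressions": 0})

for export in ("gsc_http_queries.csv", "gsc_https_queries.csv"):  # assumed file names
    with open(export, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            query = row["Query"]                       # assumed column names
            totals[query]["clicks"] += int(row["Clicks"])
            totals[query]["impressions"] += int(row["Impressions"])

for query, t in sorted(totals.items(), key=lambda kv: -kv[1]["clicks"])[:20]:
    ctr = t["clicks"] / t["impressions"] if t["impressions"] else 0.0
    print(f"{query}: {t['clicks']} clicks, {t['impressions']} impressions, CTR {ctr:.2%}")
```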

Optimize website analytics

Whether you use Google Analytics (GA) or another website analytics package, website analytics data is incredibly helpful for understanding ROI. With GA, there are several steps I recommend to help track ROI:

  • Add your site’s domain to the referral exclusion list. Recently, I shared how a tracking change in Universal GA causes many sites to incorrectly attribute some traffic to the site’s own domain as a referral. Fixing this problem for one of my clients caused them to see an uptick of 16 percent week over week in organic search traffic that had been previously attributed as self-referral.
  • Set up goals. Goals can be any call to action on your website, but for ROI, you’ll likely want to track goals that are directly attributable to lead generation or purchase, such as a request for a quote or even a newsletter signup. Remember to be judicious in how you use your goals, because each reporting view is limited to 20 goals.
  • Set up e-commerce tracking. I expect most e-commerce companies already do this, but if not, be sure to set up e-commerce tracking in GA. It can give you an immediate view into actual sales from various channels.
  • Use the Attribution Modeling tool. Setting up goals and e-commerce tracking also helps provide more information in the GA Attribution Modeling tool. This tool allows you to compare various attribution models and determine which channels are performing best.
  • Connect GSC to GA. It’s helpful to have much of GSC’s data directly in GA.

If you want to get really sophisticated, you can try to upload your offline sales data into GA as well, using Data Import. Data can be uploaded manually or via the API; so if you don’t have a developer who can help you, it can be a highly manual process. Data Import will then reveal much of what you need to know about sales data directly in GA, including lifetime value. However, it still does not allow you to personally identify specific customers or prospects — just overall trends.

Integrate marketing automation and CRM

While website analytics are helpful, they cannot identify individual buyers and how those individual buyers found your site. Also, it’s difficult to ascertain lifetime value of a marketing channel when you can’t ascertain the lifetime value of an individual customer through a given marketing platform. That’s where your marketing automation and CRM tools come in.

Unlike Google Analytics, which dictates in its terms of use that you cannot have personally identifiable information, marketing automation and CRM are all about personally identifiable information.

Earlier I mentioned the running lead source field I created in my Marketo and Salesforce.com platforms. This allows me to pull data from Salesforce, along with lead status and opportunity and value information to determine which lead sources contributed to actual qualified leads, opportunities and total sales.

Begin to measure and report

In every case I’ve seen, organic search plays a significant role, if not the most important role, in conversion. Here’s the model I like to use to demonstrate the value of SEO in an ROI report, showing all of the stages that an organic search visitor likely came through. I use this particular table in Excel to calculate B2B ROI from a first- or last-touch attribution model:

[Table: B2B ROI calculation for the organic search channel using first- or last-touch attribution]
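
As a rough, purely illustrative version of the same arithmetic, here is a short sketch. Every number and stage name is a made-up placeholder to be replaced with your own figures from analytics, marketing automation and CRM.

```python
# Minimal sketch of B2B ROI arithmetic for the organic search channel.
# All figures below are placeholders for illustration only.

organic = {
    "visits": 40_000,
    "leads": 1_200,             # form fills attributed to organic (first or last touch)
    "closed_deals": 45,
    "revenue": 450_000.00,      # closed revenue attributed to the channel
    "channel_cost": 60_000.00,  # SEO program cost for the period
}

lead_rate = organic["leads"] / organic["visits"]
close_rate = organic["closed_deals"] / organic["leads"]
roi = (organic["revenue"] - organic["channel_cost"]) / organic["channel_cost"]

print(f"Visit-to-lead rate: {lead_rate:.2%}")
print(f"Lead-to-customer rate: {close_rate:.2%}")
print(f"Organic search ROI: {roi:.1%}")
```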

Another good report to run to determine the value of each channel, separate from ROI, covers Average Order Value (AOV) and Average Lifetime Value. If you are an e-commerce company, then you can likely track this by customer in your e-commerce platform. But when you’re tracking offline sales, you may need to calculate this yourself.

If you use Data Import for GA, you can track AOV in GA. Average lifetime value may be more difficult to track directly in GA, so you can use this table to help you calculate that:

[Table: calculating Average Order Value and Average Lifetime Value by channel]

Once you have a list of all of your customers from the organic channel, you can determine what the average lifetime value is across the organic search channel by dividing the total lifetime value of all customers in this channel combined by total customers in this channel.

These tables are important because, when run against other channels, you’ll often find that organic search has high values. This can certainly help justify your value and the value of your service to the company.

If you use a fractional attribution model, however, you can’t really use the table above, as you might double-count conversions and sales against multiple channels. That’s where things get a bit more complicated. You’ll likely need to assign a percentage value to each channel that touched the customer, then only attribute a percentage of that sale’s value to each channel.
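
Here is a minimal sketch of that fractional split, applying linear and simple time-decay weights to an ordered list of touches (the kind of list you can parse out of a running lead source field). The decay factor and channel names are arbitrary illustrative choices.

```python
# Minimal sketch: distribute a sale's value across every touching channel under
# a linear and a simple time-decay model.

def linear_attribution(touches: list, revenue: float) -> dict:
    """Give every touch an equal share of the revenue."""
    share = revenue / len(touches)
    credit = {}
    for channel in touches:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

def time_decay_attribution(touches: list, revenue: float, decay: float = 0.5) -> dict:
    """Weight recent touches more heavily; each older touch earns `decay` times less."""
    weights = [decay ** (len(touches) - 1 - i) for i in range(len(touches))]
    total = sum(weights)
    credit = {}
    for channel, w in zip(touches, weights):
        credit[channel] = credit.get(channel, 0.0) + revenue * w / total
    return credit

touches = ["Organic Search", "Email", "Paid Search", "Organic Search"]
print(linear_attribution(touches, 1000.0))
print(time_decay_attribution(touches, 1000.0))
```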

Use conversion rate optimization to improve organic conversion

Once you finally know these numbers from organic search, begin focusing on how to improve them. If you’re driving lots of organic traffic to your site, but that traffic isn’t meeting your site goals (lead generation or purchase), then consider how you can test improvements to your site through conversion rate optimization techniques.

Since our ultimate measurement is ROI, it’s not enough for marketers to consider SEO successful just because organic site traffic is high. ROI isn’t about traffic — it’s about revenue. Do everything you can to improve that progression from organic search visit to conversion so that those visitors have a greater opportunity to influence your ROI.


Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



The myth of the duplicate content penalty

Reality check, myth vs facts on duplicate content

Many people are more afraid of duplicate content than they are of spammy links.

There are so many myths around duplicate content that people actually think it causes a penalty and that their pages will compete against each other and hurt their website. I see forum posts, Reddit threads, technical audits, tools, and even SEO news websites publishing articles that show people clearly don’t understand how Google treats duplicate content.

Google tried to kill off the myths around duplicate content years ago. Susan Moskwa posted on the Google Webmaster blog in 2008:

Let’s put this to bed once and for all, folks: There’s no such thing as a “duplicate content penalty.” At least, not in the way most people mean when they say that.

You can help your fellow webmasters by not perpetuating the myth of duplicate content penalties!

Sorry we failed you, Susan.

What is duplicate content?

According to Google:

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin.

People mistake duplicate content for a penalty because of how Google handles it. Really, the duplicates are just being filtered in the search results. You can see this for yourself by adding &filter=0 to the end of the search results URL, which turns off the filtering.

Adding &filter=0 to a search for “raleigh seo meetup” shows me the exact same page twice. I’m not saying Meetup has handled this well, since their canonical tags actually declare both versions (HTTP and HTTPS in this case) as correct, but it does show that the exact same page (or very similar pages) can be indexed, with only the most relevant version being shown. The page isn’t necessarily competing with itself or doing any harm to the website.

[Screenshot: Raleigh SEO Meetup SERPs showing duplicate content filtering]

How much of the web is duplicated?

According to Matt Cutts, 25 to 30 percent of the web is duplicate content. A recent study by Raven Tools based on data from their site auditor tool found a similar result, in that 29 percent of pages had duplicate content.

What are Google’s thoughts on duplicate content?

Many great posts have been published by Googlers. I’m going to give you a summary of the best parts, but I recommend reading over the posts as well.

  • Duplicate content doesn’t cause your site to be penalized.
  • Googlers know that users want diversity in the search results and not the same article over and over, so they choose to consolidate and show only one version.
  • Google actually designed algorithms to prevent duplicate content from affecting webmasters. These algorithms group the various versions into a cluster, the “best” URL in the cluster is displayed, and they actually consolidate various signals (such as links) from pages within that cluster to the one being shown. They even went as far as saying, “If you don’t want to worry about sorting through duplication on your site, you can let us worry about it instead.”
  • Duplicate content is not grounds for action unless its intent is to manipulate search results.
  • About the worst thing that can happen from this filtering is that a less desirable version of the page will be shown in search results.
  • Google tries to determine the original source of the content and display that one.
  • If someone is duplicating your content without permission, you can request to have it removed by filing a request under the Digital Millennium Copyright Act.
  • Do not block access to duplicate content. If they can’t crawl all the versions, they can’t consolidate the signals.

Sources:

Deftly dealing with duplicate content
Duplicate content due to scrapers
Google, duplicate content caused by URL parameters, and you
Duplicate content summit at SMX Advanced
Learn the impact of duplicate URLs
Duplicate content (Search Console Help)

Causes of duplicate content

  • HTTP and HTTPS
  • www and non-www
  • Parameters and faceted navigation
  • Session IDs
  • Trailing slashes
  • Index pages
  • Alternate page versions such as m. or AMP pages or print
  • Dev/hosting environments
  • Pagination
  • Scrapers
  • Country/language versions
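
Many of these variants can be spotted by normalizing URLs before comparing them. Here is a minimal sketch; the preferred protocol, the www handling and the ignored-parameter list are all assumptions to adapt to your own site.

```python
# Minimal sketch: normalize common duplicate-causing URL variants (protocol,
# www, trailing slashes, session/tracking parameters, index pages) so they
# collapse to a single form for comparison.

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign"}

def normalize(url: str) -> str:
    parts = urlsplit(url)
    netloc = parts.netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[len("www."):]                 # prefer non-www (an assumption)
    path = parts.path
    if path.endswith(("/index.html", "/index.php")):
        path = path[: path.rfind("/") + 1]            # index pages -> directory
    if len(path) > 1:
        path = path.rstrip("/")                       # drop trailing slash
    params = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in IGNORED_PARAMS]
    return urlunsplit(("https", netloc, path or "/", urlencode(sorted(params)), ""))

variants = [
    "http://www.example.com/widgets/",
    "https://example.com/widgets?sessionid=abc123",
    "https://example.com/widgets/index.html",
]
print({normalize(u) for u in variants})  # collapses to a single normalized URL
```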

Solutions to duplicate content

The solution will depend on the particular situation:

  • Do nothing and hope Google gets it right. While I wouldn’t recommend this course of action, you may have read previously that Google will cluster the pages and consolidate the signals, effectively handling duplicate content issues for you.
  • Canonical tags. These tags are used to consolidate signals and pick your preferred version. It’s a pet peeve of mine when a website has canonical tags set correctly and I see an audit that says there are duplicate content issues. It’s not an issue at that point, so don’t say that it is. (A quick canonical-checking sketch follows this list.)
  • 301 redirects. This would prevent pages from even having most duplication issues by preventing some alternate versions from being displayed.
  • Tell Google how to handle URL parameters. Setting these up tells Google what the parameters are actually doing instead of letting them try to figure it out.
  • Rel=”alternate”. Used to consolidate alternate versions of a page, such as mobile or various country/language pages. With country/language in particular, hreflang is used to show the correct country/language page in the search results. A few months ago, Google’s John Mueller, answering a question in the Webmaster Hangout, said that fixing hreflang wouldn’t increase rankings but would only help the correct version show. This is likely because Google has already identified the alternate versions and consolidated the signals for the different pages.
  • Rel=”prev” and rel=”next”. Used for pagination.
  • Follow syndication best practices.
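
As referenced in the canonical tags bullet, here is a minimal sketch for checking which canonical URL each duplicate variant actually declares. The URLs are placeholders, and it assumes the requests and BeautifulSoup libraries are installed.

```python
# Minimal sketch: fetch each variant and report the rel=canonical URL it declares.
# If every variant points at the same canonical, audits should not flag them as
# competing duplicates.

import requests
from bs4 import BeautifulSoup

def declared_canonical(url: str) -> str:
    """Return the rel=canonical URL a page declares, or '' if none is present."""
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    return tag["href"] if tag and tag.has_attr("href") else ""

variants = [
    "http://www.example.com/widgets/",
    "https://www.example.com/widgets/",
]
canonicals = {url: declared_canonical(url) for url in variants}
for url, canonical in canonicals.items():
    print(f"{url} -> canonical: {canonical or 'none declared'}")

if len(set(filter(None, canonicals.values()))) == 1:
    print("All variants point at the same canonical URL.")
```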

TL;DR

There are some things that could actually cause problems, such as scraping/spam, but for the most part, problems would be caused by the websites themselves. Don’t disallow in robots.txt, don’t nofollow, don’t noindex, don’t canonical from pages targeting longer-tail to overview-type pages, but do use the signals mentioned above for your particular issues to indicate how you want the content to be treated. Check out Google’s help section on duplicate content.

Myths about duplicate content penalties need to die. Audits, tools and misunderstandings need correct information, or this myth might be around for another 10 years. There are plenty of ways to consolidate signals across multiple pages, and even if you don’t use them, Google will try to consolidate the signals for you.


Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.