Wednesday, 30 November 2016

SearchCap: Google drops feature phones, Sitemap file size increases & videos in panels

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.


How Google is tackling fake news, and why it should not do it alone


Fact-checking and preventing fake news from appearing in search results will remain a big priority for search engines in 2017.

Following the US election and Brexit, increased focus is being placed on how social networks and search engines can avoid showing “fake news” to users. However, this is a battle that search engines cannot — and more fundamentally, should not — fight alone.

With search engines providing a key way people consume information, it is obviously problematic if they can both decide what the truth is and label content as the truth. This power might not be abused now, but there is no guarantee of the safe governance of such organizations in the future.

Here are five key ways Google can deal (or already is dealing) with fake news right now:

  1. Manually reviewing websites
  2. Algorithmically demoting fake news
  3. Removing incentives to create fake news
  4. Signaling when content has been fact-checked
  5. Funding fact-checking organizations

1. Manually reviewing websites

Google does have the power to determine who does and does not appear in their various listings. To appear in Google News, publishers must meet Google’s guidelines, then apply for inclusion and submit to a manual review. This is not the case with the content that appears in traditional organic listings.

Understanding how each part of the search results is populated, and the requirements for inclusion, can be confusing. It’s a common misconception that the content within the “In the news” box is Google News content. It’s not. It may include content from Google News, but after a change in 2014, this box can pull in content from traditional search listings as well.


“In the news” appears at the top of the page for certain queries and includes stories that have been approved for inclusion in Google News (shown above) as well as other, non-vetted stories from across the web.

That’s why Google was criticized last week for showing a fake news story that reported a popular vote win for Trump. The fake story appeared in the “In the news” box, despite not being Google News (so it was not manually reviewed).

There needs to be better transparency about what content constitutes Google News results and what doesn’t. Labeling something as “news” may give it increased credibility for users, when in reality it hasn’t undergone any manual review.

Google will likely avoid changing the carousel to a pure Google News product, as this may create concerns among news outlets that Google is monetizing traffic they believe is being “stolen” from them. Unless Google removes the ads that appear alongside organic listings whenever a news universal result is shown, it has to keep this carousel an aggregation of content from across the web.

It hasn’t been confirmed at the time of writing, but there is speculation that Google is planning to reduce the ambiguity of the “In the news” listings by replacing the box with “Top stories” (as seen in its mobile search results). Like content from the “In the news” box, these listings are a mashup of Google News and normal search listings, with the common trait being that the pages are AMP-enabled.


“Top stories” consists of AMP web pages.

In my opinion, “Top stories” still implies an element of curation, so perhaps something like “Popular stories from across the web” may work better.

Manual review isn’t viable for the entire web, but it’s a start that items from Google News publishers are manually reviewed before inclusion. The opportunity here is to be more transparent about where content has been reviewed and where it hasn’t.

2. Algorithmically demoting fake news

Traditionally, search engines have indirectly dealt with fake news through showing users the most authoritative search results. The assumption is that domains with higher authority and trust will be less likely to report fake news.

It’s another debate whether “authority” publications actually report on the truth, of course. But the majority of their content is truthful, and this helps to ensure fake news is less likely to appear in search results.

The problem is, the very ranking signals search engines use to determine authority can elevate fake news sites when their content goes viral and becomes popular. That is why, in the above example, the fake news performed well in search results.

Google’s ability to algorithmically determine “facts” has been called into doubt. Last week, Danny Sullivan on Marketing Land gave several case studies where Google gets it wrong (sometimes comically) and outlines some of the challenges of algorithmically determining the truth based on the internet.

Google has denied that TrustRank exists, but perhaps we’ll see the introduction of a “TruthRank.” There could be a series of “truth beacons,” much as the TrustRank patent outlines, with a score assigned to pages based on the number of citations from fact-checking services.

3. Removing the incentive to create fake news

There are two main goals for creating fake news websites: money and influence. Google not only needs to prevent this material from appearing in search results but also needs to play a role in restricting the financial incentive to do it in the first place.

Google AdSense is one of the largest ad networks, and it was one of the largest sources of income for authors creating fake news. One author of fake news around the US election was reportedly making $10,000 per month.

Earlier this month, both Facebook and Google banned fake news sites from utilizing their ad networks. This is a massive step forward and one that should make a big difference. There are other ad networks, but they will have smaller inventory and should receive pressure to follow suit.

A Google spokesperson said:

“Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content or the primary purpose of the web property.”

Google can do little to reduce the incentive to create fake news for the purpose of political influence. But if producing fake news becomes less effective, and culturally unacceptable, it is less likely to be used by political organizations and individuals.

4. Signaling when content has been fact-checked

In October, Google introduced a “Fact Check” label for stories in Google News, their objective being “to shine a light on [the fact-checking community’s] efforts to divine fact from fiction, wisdom from spin.” The label now appears alongside previously existing labels such as “opinion,” “local source” and “highly cited.”

Fact-checking sites that meet Google’s criteria can apply to have their services be included, and publishers can reference sources using the Claim Review Schema.
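
As a rough illustration of what that markup can look like, here is a minimal sketch that builds a ClaimReview object in Python and prints it as JSON-LD. The property names follow the schema.org ClaimReview type, but the publisher, claim, rating and URLs are placeholders invented for this example, not values taken from Google’s documentation.

```python
# Minimal sketch: emit schema.org ClaimReview markup as JSON-LD.
# All names, URLs and rating values below are placeholders for illustration.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "datePublished": "2016-11-30",
    "url": "https://example.com/fact-checks/popular-vote",  # page hosting the fact check
    "claimReviewed": "Candidate X won the popular vote",    # the claim being assessed
    "author": {"@type": "Organization", "name": "Example Fact-Checking Org"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,         # numeric verdict
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False"  # human-readable verdict
    },
}

# The output would be embedded in the article page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(claim_review, indent=2))
```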

The kind of populist politics that has surfaced in both America and the UK is cynical about the establishment, and this cynicism could very easily extend to any fact-checking organizations.

Trump has claimed the media is biased, specifically calling out sources such as The New York Times and The Washington Post. Any attack from influential people on the fact-checking organizations could quickly undermine their credibility in the eyes of populists. It needs to be communicated clearly that the facts are not defined by Google and that they are from neutral, objective sources.


Google has introduced a new “Fact Check” label.

These labels only apply to Google News, but it will be interesting to see if and how Google can expand them to the main index.

5. Funding fact-checking organizations

Of course, Google should not be defining what the truth is. Having the power to both define veracity and present it back to society concentrates power that could be abused in the future.

Google, therefore, has a large dependency on other organizations to do this task — and a large interest in seeing it happen. The smart thing Google has done is to fund such organizations, and this month it has given €150,000 to three UK organizations working on fact-checking projects (plus others elsewhere in the world).

One of the UK organizations is Full Fact. Full Fact is working on the first fully automated fact-checking tool, which will lend scalability to the efforts of journalists and media companies.

Full Fact caps donations from any one organization at 15 percent of its funding to avoid commercial influence and preserve objectivity. This is the counter-argument to any cynics suggesting Google’s donation isn’t large enough and doesn’t represent the size of the task.

Google needs accurate sources of information to present back to users, and funding fact-checking organizations will accelerate progress.

To wrap up

Casting aside all of the challenges Google faces, there are deep-rooted issues in defining what constitutes the truth, the parameters of truth that are acceptable and the governance of representing it back to society.

For Google to appear to be objective in their representation of truth, they need to avoid getting involved in defining it. They have a massive interest in this, though, and that’s the reason they have invested money into various fact-checking services.

Over the past decade, it’s possible to point to where the main focus of search engines has been at any given time, e.g., content or links. Going forward, I think fact-checking and the tackling of fake news will become as high a priority as any other.

Google needs to act as a conduit between the user and the truth — and not define it.





Removing duplicates in Yext: still a hands-on process


A recent case study by Yext shows the impact of duplicate listings on local rankings in Google. Coordinated by local search expert Andrew Shotland, the core research evaluates Yext duplicate suppression for a national restaurant chain.

I’m not here to dispute the research or results. Having consistent NAP (name, address and phone number) has long been regarded as a priority for local businesses. However, there are limitations to using Yext for finding and removing duplicates.

Launched in June of 2014, duplicate listing suppression has been a selling point for the Yext platform. Different from deletion, suppression redirects search engines and customers to the correct information on a particular website. The suppression happens as long as a client or the agency is a paid subscriber to Yext.

Full disclosure: I am Yext-certified and currently manage 60 unique clients in Yext. It is a powerful platform and useful for scaling citation management. Yet when it comes to duplicate suppression, there are many areas where Yext can improve.

The platform isn’t created to find all duplicates

Doing a Yext Scan is a fun way to show clients all the issues with their local listings. But it isn’t set up to show multiple duplicates for a single citation, and it includes only 53 sites. It appears to me that Yext cherry-picks to create a report with the goal of showing as many mistakes as possible.

Once client details are added to Location Manager and PowerListings have started to sync, Yext will crawl online for duplicates. The possible duplicates tab shows the duplicates Yext has automatically found across its network.


The platform does look for name, address and phone duplicates, although it isn’t comprehensive. Yext especially has difficulties where a business name has changed or a business is using multiple assumed names. Duplicates where only the phone number or only the address matches are frequently missed.

A user of Yext can also submit duplicates through the platform, which is a common occurrence. Yext requires a URL of the duplicate, but what happens next is where the platform could really be improved.


Not all duplicates can be suppressed

A client of mine had two duplicates in CitySearch that Yext didn’t find, so they required manual submission. A month later, the duplicates were still not flagged in the system.

Sometimes, Yext reveals which duplicates are being processed, and other times it doesn’t. Even worse, it can sometimes tell you a duplicate is being suppressed when it isn’t. For the same client, two duplicates in Superpages were shown in Yext as being suppressed. However, these listings were still live on Superpages and being crawled by Google.


Another option is to submit duplicates to Yext support. Below is a quote from a support agent in response to my request to remove duplicates for a client that had purchased a previously used phone number.

“The listings that only match the phone number do not follow our 2/3 guidelines. We are not able to submit another business’ listing for suppression. It is the responsibility of that business to correct the phone number on their listings if they are no longer using it.”

Yext does not suppress at the source

Not all local citations are included in the Yext PowerListings Network. Even sites in the network, such as Factual, don’t allow for duplicate suppression. A user is still required to submit a manual duplicate ticket for Factual.

In addition to Factual, the data aggregators Express Update/InfoUSA, Neustar Localeze, Acxiom and Dun & Bradstreet are excluded from the Yext network. These aggregators are often the source of duplicates in Yext and on many other sites.

An SEO consultant should still catalog correct and incorrect NAP in a spreadsheet and check Google and important citations for more duplicates. Moz Local can be used to scan data aggregators.
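
To help with that cataloging, here is a minimal sketch (my own illustration, not part of Yext or Moz Local) that normalizes name, address and phone data and flags listings that share a phone number or address — exactly the partial matches Yext tends to miss. The sample listings are made up; in practice you would load them from your citation spreadsheet.

```python
# Minimal sketch: flag possible duplicate local listings by normalized NAP data.
# The sample listings are made up; load yours from a citation spreadsheet/CSV.
import re
from collections import defaultdict

def norm_phone(phone):
    """Keep digits only, so '(555) 123-4567' and '555.123.4567' match."""
    return re.sub(r"\D", "", phone)

def norm_text(value):
    """Lowercase and strip punctuation for loose matching of names/addresses."""
    return re.sub(r"[^a-z0-9 ]", "", value.lower()).strip()

listings = [
    {"site": "Citysearch", "name": "Acme Plumbing",     "address": "12 Main St.", "phone": "(555) 123-4567"},
    {"site": "Superpages", "name": "Acme Plumbing Co.", "address": "12 Main St",  "phone": "555-123-4567"},
    {"site": "Factual",    "name": "Acme Plumbing",     "address": "98 Oak Ave",  "phone": "555-999-0000"},
]

# Group by normalized phone, then by normalized address; any group with more
# than one entry is a candidate duplicate worth reviewing by hand.
for label, key_fn in [("phone", lambda l: norm_phone(l["phone"])),
                      ("address", lambda l: norm_text(l["address"]))]:
    groups = defaultdict(list)
    for listing in listings:
        groups[key_fn(listing)].append(listing)
    for key, group in groups.items():
        if len(group) > 1:
            sites = ", ".join(l["site"] for l in group)
            print(f"Possible duplicates by {label} ({key}): {sites}")
```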

Yext could be pushing incorrect data

It doesn’t happen often, but there are some scenarios where Yext could be pushing duplicate or inaccurate data.

The first is not having access to an existing Yext account. An existing PowerListing subscription could be sending incorrect data. You will not be able to add a location to a second Yext account until it is removed from the original.

For removal, Yext tends to require permission from the account owner. I have been unsuccessful at this in a few cases. One was for an HVAC client partnered with Lennox, which automatically subscribes all authorized dealers to a PowerListings subscription. Unfortunately, Lennox required that a tracking phone number and their own landing page be published in place of the client’s local number and website.

The second scenario is NAP accuracy. Yext has some checks on the data entered in Location Manager, but it doesn’t check against a business license or a registered office address. In a recent test, I was able to add a company twice to the PowerListings Network, but with a different phone number.

Despite these flaws, Yext is still in my arsenal for local listing management. If you choose to use Yext to suppress duplicates, understand the strengths and gaps in using the platform to do so.





Jagdish Chandra Bose Google doodle marks 158th birthday of biophysicist, botanist & crescograph inventor


Today’s Google doodle honoree has a long list of scientific accomplishments. Born in what is now Bangladesh on this date in 1858, Jagdish Chandra Bose’s professional tenure spanned several scientific disciplines, including biology, physics and botany.

According to the Google Doodle Blog, Bose’s work included radio and microwave science research, and he is credited with wireless telecommunication innovations. His accomplishments in the field of wireless communication earned him the honor of having a moon crater named for him – the Bose Crater located on the far side of the Moon.

Bose’s crescograph invention – which is highlighted in today’s doodle – measured plant growth and movement, making it possible to identify similarities between animals and plants.

Marking Bose’s 158th birthday, the doodle illustration shows him at work with a plant and his crescograph. The image leads to a search for “Jagdish Chandra Bose” and has a sharing icon to post the doodle on social feeds or send via email.

As if the study of plants, radio science and physics weren’t enough, Bose also published “The Story of the Missing One,” a short story regarded as one of the first works of Bengali science fiction.



How brands can win with omnichannel discovery


Omnichannel marketing is not new, but the concept continues to fascinate writers and practitioners alike, myself included. That’s because consumer behavior keeps changing, and businesses need to be responsive to these changing behaviors so that they can continue to be found.

While brands used to worry about coordinating their marketing across different channels, now they need to respond to the reality that people are using multiple channels, devices and communication modes to get what they want in an increasingly on-demand fashion. For instance, you can now order a pizza with a voice command to Amazon Echo or with an emoji on Twitter.

Welcome to the world of omnichannel discovery.

As I noted in a previous Search Engine Land column, consumers long ago graduated from Google searches on desktops and mobile phones, with an explosion of devices and technologies continuing to shape how and where we search.

According to Nielsen, “Americans now own four digital devices on average, and the average US consumer spends 60 hours a week consuming content across devices.”

Consequently, as Forrester Research noted in a private report, people discover brands and transact with them in a variety of ways, ranging from social networks to apps.

Google calls these touch points micro-moments because people decide what to do and what to buy through rapid moments of decision-making — which can be challenging for brands trying to figure out how to turn omnichannel micro-moments into business at a location level.

Businesses can win in a world of omnichannel discovery by leading consumers through a smooth journey across channels. Doing so means creating the right experience for the right channel and customer. A mobile wallet offer might be perfect for connecting with a consumer doing a mobile search using Google, but not so much for someone using Snapchat to have a playful experience with a brand.

Some brands are giving us a glimpse of how to manage the omnichannel journey by creating experiences appropriate for each channel and device. Here are a few:

Disney

Disney guides Disney World visitors through an omnichannel journey that ranges from the desktop to mobile to wearables. Disney understands that searching for and booking a vacation is a complicated task that usually occurs on the desktop.

Accordingly, the My Disney Experience website contains myriad functions appropriate for planning a visit, ranging from choosing lodging to buying tickets. Click-to-call functionality and live chat make the potentially complicated process a lot easier.

From there, Disney transitions users to the Disney Experience app for discovering things to do and checking crucial information such as wait times for attractions while visitors are on-site.

Disney also offers the increasingly popular MagicBand RFID wearables that make it easy for guests to manage transactions during their stay, such as purchasing food. By storing information about guests (including their locations throughout a day at the parks) through MagicBands, Disney can design more personal experiences based on their preferences.

Domino’s Pizza

Domino’s operates over 5,200 stores in the United States and gets more than 50 percent of its sales through digital channels — with half of those coming from mobile.

Domino’s thrives in an on-demand world by adapting to consumer behavior. Ordering pizzas with emojis and voice commands are not gimmicks because voice-related searches and the use of emojis reflect how people are interacting with each other and with brands in the post-digital age.

I think it’s telling that Domino’s CEO, Patrick Doyle, transitioned the company to the smartphone age by famously tasking his Information Technology team to “Make it so a customer could order a pizza while waiting for a stoplight.” He understood intuitively that mobile was, and remains, rooted in this kind of behavior.

Nowadays, you can also order Domino’s pizzas through a growing number of channels and devices beyond Twitter and Echo, including mobile apps, Facebook messenger bots, your Apple Watch and smart televisions. Domino’s is a great example of creating the right experience for the consumer, channel and device.

What you should do next

Succeeding in a world of omnichannel discovery requires businesses to take stock of their customers’ journeys, develop an omnichannel location data strategy and share content and experiences that turn discovery into commerce. Here are three steps you can take now:

  1. Develop an omnichannel location data strategy. Once you understand every touch point your customer uses, ensure that your brand is visible on each one through location data. In an omnichannel world, it’s not enough to possess accurate data on your stores’ location pages. You’ll need to distribute your location data across all the places where discovery is occurring, ranging from Facebook to mobile apps such as Google Maps.
  2. Understand your customer’s omnichannel journey. Examine the entire journey your customer takes from home to store. Ask questions such as how many touch points do they encounter, and what devices are they using to interact with your brand? Understanding your customer’s omnichannel journey will require you to employ tools such as journey maps, which designers use to research and depict the customer experience journey.
  3. Create next moments appropriate for each channel. As I have discussed previously, a “next moment” is the action that occurs after someone finds your brand through a search. A next moment for a Google search on a mobile device might consist of a mobile wallet offer. A next moment on Instagram or Pinterest might consist of sponsored visual content or promoted pins that encourage shoppers to visit brick-and-mortar stores to take advantage of a sale. In all three cases, the next moment needs to maximize and capitalize on the unique attributes of each channel and device (e.g., highly visual content for Instagram).

The next frontier of omnichannel discovery for businesses will involve using advanced analytics and consumer measurement tools to anticipate consumer discovery and either positioning themselves with the right solution before a search begins or pre-empting the search completely.

By deploying a location data strategy that involves being visible and creating next moments where consumers are conducting “near me” searches, you’ve set up your brand for success as the nature of omnichannel changes.





Google drops their feature phone crawler & error report in Search Console

Google has officially dropped full support for feature-phone web sites.


How to Find Epic Keyword Opportunities That Turn Into Easy SERP Wins by @josephhhoward

“The best way to improve your online visibility is to write great content!”

We’ve all heard this before. In fact, we hear it over, and over, and over again.

Personally, I can’t stand when people say it.

Why? Because in my opinion, it misses the point entirely!


Are there advantages to writing great content? Of course.

  • Better conversion. The more engaged your readers are, the more likely they are to give up their email address, like your page on Facebook, or even buy your products or services.
  • Better time-on-page and lower bounce rate. The longer your visitors interact with your content, the better signals are sent to Google about how valuable your writing is to readers.
  • Build more relationships in your industry. When you write something terrific, people take notice. This could help you develop partnerships or interact with people who could help you grow your online business.

But when people tell you that great content is the be-all, end-all to ranking well in search engines, it bugs me!

  • Does that mean you can’t rank well if you’re not a world-class writer? The necessity to produce “great content” suggests that if you’re not one of the best writers in your industry or you don’t have the money to pay one to write for your website, you can’t compete. This couldn’t be further from the truth.
  • It’s about providing value, not “greatness.” People use Google to find a resource that answers their query. The focus of Google since its inception has been to answer searcher intent. That should be your priority too, and if a brilliant piece of writing comes out of it, so be it.
  • The chase for greatness ignores low-hanging fruit. If you have a new website and want to write some “great content” to try to outrank a 5000-word guide on a DA 91 site, go right ahead. Chances are you’ll be up the creek without a paddle. The trick is to find keyword opportunities you can actually compete for and write content that beats that of your pedestrian competition.

While writing terrific content has its advantages, it’s not the only way to win when it comes to gaining visibility in search engine results. Finding the right keyword opportunities means you only have to write content better than your average competitor, not the entire industry.

My company WP Buffs is a Domain Authority (DA) 19 site. Not very imposing.


Yet we’ve managed to rank for some low- and medium-competition long-tail keyword phrases. In this case, we managed to rank better than WordPress (DA 100) by optimizing for more niche search phrases.


Here’s the exact blueprint we use to find keyword opportunities that will allow us to increase our visibility in search results and win at Google.

1. Join All the Email Newsletters

First off, you’ll need to do some information gathering in your industry.

  • Do a quick Google Search for “[your industry] newsletter” and sign up for as many as you come across. It’s important to sign up not only for the ones that seem high-quality, but for ones that are a little rougher around the edges. Remember — we’re looking for keyword opportunities, which means most will come from smaller or less refined websites.
  • Set up Google Alerts for your industry. Because Google indexes most of the web (unless site owners tell it not to), this will allow you to capture new articles in your industry regardless of whether or not the publisher sends out a newsletter.
  • Sign up for Unroll.me to combine all your newsletters and Google alerts into one daily email. This will help you prioritize the important items in your inbox and review content when you decide it’s time.

We’re signed up for every WordPress newsletter on the web. We get tons of emails every day about what our SERP competitors are writing about, and it all arrives in one tidy email every morning.


2. Use SEMrush to Find Long-Tail Keyword Searches

When reviewing newsletters or Google alerts, take every article you find and submit it to SEMrush. It’s great for competitor analysis and keyword research.

When looking over what our WordPress competitors were writing about, we found these two articles:

  • Better Media Management With WordPress Real Media Library: http://ift.tt/2fiP4ws
  • Customizing the WordPress Media Library: http://ift.tt/2ffI84s

We took these articles, along with every other article our team read, and ran them through SEMrush for a bit of competitor research.

There are a few ways we can find keyword opportunities from these articles:

  • Under Domain Analytics > Organic Search > URL, you can find out exactly what keywords this article is ranking for.
  • Under Keyword Analytics > Related Keywords, you can search for related keyword phrases to those in the article title. In this case, we did a search for “WordPress media library.”

Remember — what you’re looking for are long-tail keyword phrases, or searches that contain a string of words instead of just one or two.

By using the related search feature in SEMrush, we found the keyword phrase “WordPress digital asset management” along with a few others.


Note: The free version of SEMrush is terrific for base-level research, but if you want to dive even deeper, you may have to sign up for their introductory plan. They offer a seven-day money-back guarantee, so feel free to sign up, play around with it for a week and decide if it’s something you want to use long-term!
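
Once you export your keyword research to a CSV, a short script can pull out the long-tail candidates for you. This is a minimal sketch that assumes the export has “Keyword” and “Volume” columns; adjust the column names and thresholds to match whatever your SEMrush report actually contains.

```python
# Minimal sketch: filter a keyword CSV export down to long-tail phrases.
# Assumes columns named "Keyword" and "Volume"; rename to match your export.
import csv

MIN_WORDS = 3    # treat three or more words as "long-tail"
MIN_VOLUME = 50  # skip phrases with almost no search volume

def long_tail_keywords(path):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            phrase = row["Keyword"].strip()
            volume = int(row["Volume"].replace(",", "") or 0)
            if len(phrase.split()) >= MIN_WORDS and volume >= MIN_VOLUME:
                yield phrase, volume

if __name__ == "__main__":
    for phrase, volume in sorted(long_tail_keywords("keywords.csv"),
                                 key=lambda kv: kv[1], reverse=True):
        print(f"{volume:>7}  {phrase}")
```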

3. Do Some Manual Research

Now that you’ve put together a solid list of long-tail keywords, it’s time to see where the best opportunities are for you. This entire strategy will only work if we can find real opportunities in search results to focus a piece of content on.

Before we dive into search results, get the Mozbar Google Chrome plugin. This will allow us to see DA, Page Authority (PA), and links back to a website directly from Google search results! You’ll also need to create a free Moz account.


Activate Mozbar and start manually searching Google using the long-tail search phrases you found using SEMrush.

  • Look for searches in which the top search result has a low DA. This is the easiest way to spot a search phrase that gets significant traffic according to SEMrush and provides a good opportunity for you to write content that takes over that #1 spot. A website with DA <40 is usually a great opportunity regardless of the DA of your own site.


  • Always remember searcher intent. If the top three websites have DA 100 but none of them answer the searcher’s intent, that’s also a good opportunity. You may end up ranking fourth, but you’ll get more people clicking through to your website than the three above since you answer their question and they don’t! That’s what we found with the search “WordPress digital asset management.” People would be looking for a more in-depth guide on the topic, not WordPress support.


  • Click through to websites to see what the content looks like. Often, you’ll find that the content that’s ranking is extremely thin and not helpful to searchers at all. That means an opportunity for you! The top-ranking page underneath the WordPress pages for the search “WordPress digital asset management” had 259 words. A 2000-word guide would trounce that!

You can also find long-tail keyword phrases by allowing Google to give you suggestions.

  • Use Google’s autocomplete suggestions to add to your list of long-tail phrases. Google makes these suggestions based on the searches that get the most traffic, so they’re worth considering.


  • Scroll to find more Google suggestions. Often, Google will include a list of related search phrases at the bottom of every page of search results.

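Both kinds of suggestions can also be gathered programmatically. The sketch below queries the widely used but unofficial and undocumented Google suggest endpoint, so treat it purely as an illustration; the endpoint and its response format can change or be rate-limited at any time.

```python
# Minimal sketch: pull Google autocomplete suggestions for a seed phrase.
# Uses the unofficial "suggestqueries" endpoint, which is undocumented and can
# change or be rate-limited; for serious research, rely on a proper tool.
import json
import urllib.parse
import urllib.request

def autocomplete(seed):
    url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
           + urllib.parse.quote(seed))
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.loads(resp.read().decode("utf-8", "replace"))
    return data[1]  # second element of the response holds the suggestion list

if __name__ == "__main__":
    for suggestion in autocomplete("wordpress media library"):
        print(suggestion)
```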

4. Writing Tips

The entire point of finding epic keyword opportunities is so you can write something that’s better than an average piece of content but doesn’t necessarily have to be one of the best in your industry. This means that even if you’re not a world-class writer, you can compete in search results by doing a bit of keyword research beforehand.


That being said, putting together a solid article to help your piece of content rank well can still be challenging. Here are a few tips:

  • Write something long. Google likes long-form content that goes deep to answer a question. Spend three hours writing one 1500-word article instead of three 500-word articles. You’d rather have one piece of content ranking in the top few positions than three articles ranking on the second or third page.
  • Focus on how-to guides. These are the easiest articles to write and show your expertise in the industry. People are more likely to buy from you if they know you solve problems when you encounter them!
  • Focus on evergreen content. This means stay away from news and what’s hot right now. Stick to content that will be useful for searches in the next few years, not just the next few weeks.
  • Make sure your on-page content is fully optimized. That means page title, H1 header, image alt tags, social media tags, etc. (see the quick checker sketch after this list).
  • Getting backlinks will be helpful, but it’s not always necessary in this case. In this situation, we’re trying to outrank content that’s very average and doesn’t have any high-quality links. Writing long-form content and making sure your on-site optimization is on point should be enough.
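
Here is the quick checker mentioned above: a minimal sketch that parses a saved HTML page with the Python standard library and reports a missing title, a missing or duplicated H1, and images without alt text. It covers only the basics named in the list, and the file name is a placeholder.

```python
# Minimal sketch: check a saved HTML page for basic on-page elements
# (title, a single H1, alt text on images). Standard library only;
# "page.html" is a placeholder file name.
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

if __name__ == "__main__":
    checker = OnPageChecker()
    with open("page.html", encoding="utf-8") as f:
        checker.feed(f.read())
    print("Title:", checker.title.strip() or "MISSING")
    print("H1 count:", checker.h1_count, "(should be exactly 1)")
    print("Images missing alt text:", checker.images_missing_alt)
```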

Now it’s your turn! This strategy is useful regardless of how competitive your industry is or how authoritative your website is. We’re in the WordPress space, which has a lot of major players, and we’re still able to compete in search results and hold our own. Give it a shot, stick with it, and over time you’ll start to see your organic traffic trend upward.

 

This is my first contribution to SEJ, so if you’ve read this far, I would really appreciate some feedback! Please leave your thoughts in the comments below and let me know if this strategy worked for you too!

Image Credits:
Featured Image: Maarten van den Heuvel/Unsplash.com
In-post Photo: stevepb/Pixabay.com
Screenshots by Joe Howard. Taken November, 2016.

 


5 Types of Google Penalties (And What You Need to Do to Recover) by @IAmAaronAgius

When webmasters fear getting Google penalties, most of them think of the dreaded algorithm updates: Penguin, Panda, the Top Heavy algorithm, etc. Get caught in one of these algorithm sweeps, and you could lose some or even all of your organic search traffic.

But Google actually has a lot more in its arsenal than just algorithms to encourage you to follow their Webmaster Guidelines. Ever heard of a Manual Action Penalty?

What’s a Manual Action Penalty?

Google has teams of search engineers tasked with the job of reviewing individual websites and, if necessary, assigning a rank penalty.

When Google runs Penguin, sites across the web can take a rank hit. A Manual Action Penalty means your site alone has been hit, and it’s your problem to fix.

Manual Actions are the most common type of Google penalty you can get. If you have one, you should see a message in Search Console about it:


You can go to Search Console and check for yourself right now. Just go to Search Traffic > Manual Actions and see if you have a message:


If Google has penalized your site with a manual action, you’ll see a message here describing the penalty. You can either get a Site-wide Match (meaning your whole site is affected), or a Partial Match, meaning only certain pages of your site are penalized.

Google Search Console Help provides a list of 12 common manual actions you can receive (meaning that there are more):

  • Hacked site
  • User-generated spam
  • Spammy freehosts
  • Spammy structured markup
  • Unnatural links to your site
  • Thin content with little or no added value
  • Cloaking and/or sneaky redirects
  • Cloaking: First Click Free violation
  • Unnatural links from your site
  • Pure spam
  • Cloaked images
  • Hidden text and/or keyword stuffing

Some of them, like the Pure spam or Spammy freehosts penalty, aren’t likely to happen to your average webmaster (unless you own a blatant spam site or host a lot of them).

A lot of webmasters could be at risk for other manual actions, though.

The good news is you can fix the problems Google’s engineers found on your site, and then request a review of the Manual Action in Search Console:


Google’s engineers will review it, and if they approve it, they’ll remove the penalty and allow your pages to start gaining rank again.

Here are five common types of penalties my clients have gotten, and a walk-through of how we’ve helped their sites recover from each.

1. Unnatural Links

There are two different kinds of “Unnatural Links” penalties Google has:

  1. Unnatural Links from Your Site: You are hosting unnatural, artificial, or deceptive outbound links on your site.
  2. Unnatural Links to Your Site: You have unnatural, artificial, or deceptive backlinks pointed at your site.

These manual actions are in line with Google’s Penguin update, meant to penalize people who are participating in link exchanges, buying links, or selling them to manipulate rank.

If you have an unnatural links penalty, here are some examples of the kind of links you need to fix:

  • Paid links
  • Links acquired from link exchanges
  • Spammy guest post links
  • Automatically generated links
  • Spammy forum comment links
  • Irrelevant backlinks
  • Links from low-quality directory or bookmark sites

Clean up the Links From Your Site (Outbound)

Use a link analysis tool like Ahrefs or Majestic to get a list of your outbound links. SEOChat’s Website Crawler is another free option that will analyze 100 pages for you without registering.

Download a list of external links from the report:


Identify any links on your site that are against Webmaster Guidelines. Once you find them, you can either:

  • Remove the links
  • Redirect the links through an intermediate page that is blocked by robots.txt
  • Set the links to “nofollow”
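
To help with that identification step, here is a minimal sketch that walks a local export of your pages and lists every external link, noting whether it already carries rel="nofollow". It uses only the Python standard library; the folder name and domain are placeholders.

```python
# Minimal sketch: list external links in a local HTML export and whether they
# already carry rel="nofollow". "site_export" and YOUR_DOMAIN are placeholders.
from html.parser import HTMLParser
from pathlib import Path
from urllib.parse import urlparse

YOUR_DOMAIN = "example.com"

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        host = urlparse(href).netloc
        if host and YOUR_DOMAIN not in host:  # external link
            nofollow = "nofollow" in (attrs.get("rel") or "").lower()
            self.links.append((href, nofollow))

for path in Path("site_export").glob("**/*.html"):
    collector = LinkCollector()
    collector.feed(path.read_text(encoding="utf-8", errors="ignore"))
    for href, nofollow in collector.links:
        flag = "nofollow" if nofollow else "FOLLOWED - review"
        print(f"{path}: {href} [{flag}]")
```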

Clean up the Links to Your Site (Inbound) 

You can get a list of links pointed at your site using your backlink analyzer of choice, or you can use Search Console. Just click “Search Traffic,” then “Links to Your Site,” and you can download a list.


Find any backlinks that are against Webmaster Guidelines.

Next, you’ll need to send out take-down request emails to the webmasters hosting them. If they don’t respond to you, then as a last resort, use Google’s Disavow Tool to tag the links so they don’t pass PageRank.
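
If you do reach the disavow stage, the file Google expects is plain text: one URL per line, or an entire host prefixed with “domain:”, with “#” for comments. Here is a minimal sketch that turns a simple list of bad backlink URLs into that format; the input file name and the set of domains to disavow wholesale are placeholders, and every line should be reviewed by hand before you upload anything.

```python
# Minimal sketch: build a disavow.txt from a plain list of bad backlink URLs.
# "bad_links.txt" (one URL per line) and the whole-domain set are placeholders;
# review every entry by hand before uploading anything to the Disavow Tool.
from urllib.parse import urlparse

DISAVOW_WHOLE_DOMAINS = {"spammy-directory.example", "link-farm.example"}

lines = ["# Disavow file generated from bad_links.txt"]
seen_domains = set()

with open("bad_links.txt", encoding="utf-8") as f:
    for raw in f:
        url = raw.strip()
        if not url:
            continue
        host = urlparse(url).netloc
        if host in DISAVOW_WHOLE_DOMAINS:
            if host not in seen_domains:
                lines.append(f"domain:{host}")  # disavow every link from this host
                seen_domains.add(host)
        else:
            lines.append(url)                   # disavow just this one URL

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print(f"Wrote {len(lines) - 1} entries to disavow.txt")
```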

Once you’ve cleaned up your links, you can move on to submit a reconsideration request.

2. User-Generated Spam

If you’ve gotten a User-Generated Spam penalty, it doesn’t mean you’re a spammer – but your site users are. As far as Google’s concerned, it’s up to you to clean up spammy content people post to your:

  • Forum pages
  • Guest book pages
  • Blog post comments
  • Other site areas

Mozilla famously got penalized by this Manual Action a while back. Here are some user-generated spam examples from their site:


If you haven’t already, the first thing you’ll want to do is install some kind of anti-spam software. Akismet is a popular WordPress tool that will detect and filter out some of your spam comments.

Hopefully, this does most of the cleanup work for you, but don’t stop there. You need to manually go through and remove any spam that got through the filters.

Look out for things like:

  • Posts that are blatant advertisements
  • Posts with gibberish text
  • Posts with off-topic links (Probably the most common type of comment spam I see)
  • Commercial-sounding content (Think payday loans, discount insurance, libido enhancers, etc.)
  • Auto-generated comments

You should also vet your user profiles, and delete any that might be spam accounts. These are usually auto-generated, have no profile photo, no description, and of course, post a lot of irrelevant comments.

A User-Generated Spam penalty is one you’re likely to get again and again, unless you take a hard line on spam from now on. To help my clients who have had this penalty prevent spam in the future, we:

  • Use a CAPTCHA on their sites
  • Change all forum links to “nofollow”
  • Allow users to report spam
  • Consider moderating all comments


If you have a User-Generated Spam penalty, chances are you have a lot of comments and user accounts to go through. Neil Patel’s Quicksprout got this penalty several times and was faced with nearly 350,000 forum users to sift through.

But it’s worth it to be as thorough as possible, because if Google sees there’s still spam on your site, they’ll reject your reconsideration request.

In my experience, you have three options:

  1. Take the time to go through all your user-generated content yourself.
  2. Hire someone else to do it for you.
  3. Delete all your user-generated content.

In the end, number three is what Neil Patel did. The decision is up to you – if user-generated content is central to your site, it might be worth it to clean everything up.

3. Hacked Site

Some of my clients followed Google’s Webmaster Guidelines to a T, but still ended up with a manual action penalty because their site was hacked.

Hacked sites pose a threat to you and your site users, so Google wants you to clean things up.


Keep in mind that if hackers are doing something malicious on your site, you might not even get a “Hacked Site” manual action. I’ve handled several cases where unknowing webmasters have ended up with a “Pure Spam” manual action instead.

Fixing a hack can be a big undertaking, but Google has some helpful resources for what to do.

Here are the basic steps:

Quarantine your site

You don’t know what’s happened to your site or if the problem is spreading.

So the first thing you want to do is take your site offline. You can do this by either stopping your web server or setting up a 503 response code.
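
How you return that 503 depends on your stack. Purely as an illustration, here is a minimal sketch of a standalone maintenance responder you could run in place of the application while you investigate; the port, message and Retry-After value are placeholders.

```python
# Minimal sketch: answer every request with 503 Service Unavailable while the
# real site is quarantined. Port, message and Retry-After are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Site temporarily offline for maintenance."
        self.send_response(503)                   # temporary, so crawlers keep the URL
        self.send_header("Retry-After", "86400")  # hint: try again in about a day
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    do_POST = do_GET  # treat every method the same while offline

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```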

You’ll also want to change all the passwords for your site, including system administrators, content management systems, FTP logins, and any other access points.

Identify the type of hack

Next, you need to figure out what kind of hack you have.

Search Console may have left you a message about it to go with your Manual Action Penalty. If not, move on to check the “Security Issues” section of Search Console.


Here are some of the ways you can be hacked:

  • Spam – Someone’s adding spammy pages, links, or text to your site
  • Malware – Someone’s installed software to damage computers
  • Phishing – Someone’s installed software to collect information about your site users

Eliminate the vulnerability

Before you fix any changes hackers made to your site, you need to figure out how they accessed your site in the first place. If you don’t close this hole, they can continue to damage your site.

The problem could be a lot of things, like:

  • A virus-infected computer
  • Weak passwords
  • Out-of-date software
  • Permissive coding practices

If you aren’t comfortable investigating these possibilities yourself, bring a professional in to do it for you.

Clean up the hack

This is another job you’ll probably want a professional to do. They can remove the malware by hand and help you clean up your servers.

If the hacker got access to confidential user information on your site, you’ll have some legal responsibilities as well. Here’s a helpful resource on what to do in that case.

4. Cloaking and/or Sneaky Redirects

If you’ve gotten a manual action penalty for cloaking or sneaky redirects, one of these things happened:

  • Cloaking: You’re showing some content to Google but not to site visitors (either images or text)
  • Sneaky redirects: Your pages indexed in Google redirect users to completely different content

Here’s what I do with clients to help them recover from both.

Check for cloaking

To figure out what the problem is, use Fetch as Google. This tool will show you how Google sees your site.

Take the root address of affected pages from your Manual Actions report, and plug them in:


Compare Google’s rendering of your page to how it appears in your browser. If there are any differences, fix them.

Most cloaking is deliberate, but if you aren’t sure why your pages look different, talk to your web developer, SEO agency, and anyone else who has access to your HTML to diagnose the problem.

Repeat the process, rendering different versions of your site pages, including mobile.
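
A rough, do-it-yourself complement to Fetch as Google is to request the same URL with a Googlebot-style User-Agent and with a normal browser User-Agent and compare what comes back. The sketch below does only that: it catches server-side User-Agent cloaking, not differences introduced by JavaScript rendering, so Fetch as Google remains the authoritative view. The URL and similarity threshold are placeholders.

```python
# Minimal sketch: compare the HTML a server returns to a Googlebot-style
# User-Agent versus a regular browser User-Agent. Large differences hint at
# server-side cloaking (JavaScript rendering differences are not covered).
import difflib
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read().decode("utf-8", "replace")

def compare(url):
    as_bot = fetch(url, GOOGLEBOT_UA).splitlines()
    as_browser = fetch(url, BROWSER_UA).splitlines()
    ratio = difflib.SequenceMatcher(None, as_bot, as_browser).ratio()
    print(f"{url}: similarity {ratio:.2%}")
    if ratio < 0.95:  # arbitrary threshold; tune for your own pages
        print("  -> responses differ noticeably; review for cloaking")

if __name__ == "__main__":
    compare("https://example.com/affected-page")  # placeholder URL from your report
```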

Check your redirects

Next, check your redirects using a tool like Screaming Frog. Their report has a “Redirect URI” column so you can analyze each URL destination.


Look for any URLs on your site that redirect somewhere that site visitors probably didn’t want to go. Change these redirects to more relevant pages, or remove the redirect entirely.
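
If you prefer a script to a crawler for spot checks, the third-party requests library exposes each redirect chain directly. A minimal sketch, with placeholder URLs:

```python
# Minimal sketch: print the redirect chain for each URL.
# Requires the third-party "requests" package (pip install requests).
import requests

urls = [
    "http://example.com/old-page",  # placeholders; use URLs from your own site
    "http://example.com/promo",
]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [f"{r.status_code} {r.url}" for r in resp.history]  # intermediate redirects
    hops.append(f"{resp.status_code} {resp.url}")               # final destination
    print(url)
    for hop in hops:
        print("   ->", hop)
```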

Check for deceptive buttons, ads, and plugins

If you use an anti-hotlinking plugin to protect your images and bandwidth, Google could see it as cloaking. You may need to remove the plugin or disable the anti-hotlinking feature.

Also look out for any advertisements on your site that could trick people into clicking on something they wouldn’t have otherwise. These often look like trusted entities but are actually ads:


Any of these things could be responsible for the manual action, so make sure you clean up as much as possible to make Google happy with your site.

5. Thin Content

Google wants to deliver a variety of quality options in search results. If your site is full of shallow, duplicate content, you could get a manual action to keep your pages low in rank.

Here’s what Google means by “thin content,” so you can evaluate your own pages:

Duplicate content from other sites

If you’ve taken content from another site and republished it on yours, this can be considered thin content.

Some webmasters scrape content from somewhere (like Wikipedia), make minor changes, and republish. If you sell products and copy a manufacturer’s product descriptions, that can also count.


Thin content with affiliate links

If you have an affiliate site and publish content with the sole purpose of hosting affiliate links, you could get a thin content penalty.

Google wants to see that your content offers more value than what the original affiliate can provide. If your site is full of product descriptions and reviews from the merchant, you need to seriously improve your content.

Duplicate content on your site

Hosting a lot of identical or very similar pages on your own site can also land you a thin content penalty. I’ve seen SEOs come across this problem when they use doorway pages targeting different regions:


Auto-generated content

Google sees auto-generated content as thin. This can include auto-translated text, automatically spun content, and text generated from scraping RSS feeds.
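
Before deciding what to do with suspect pages, a quick word-count pass over a local export of your site can help you triage candidates. This is a minimal sketch using only the Python standard library; the folder name and the 300-word threshold are arbitrary starting points, not Google numbers.

```python
# Minimal sketch: flag local HTML pages with very little visible text.
# "site_export" and the 300-word threshold are placeholders, not Google rules.
from html.parser import HTMLParser
from pathlib import Path

class TextExtractor(HTMLParser):
    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.words = 0
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.words += len(data.split())

for path in Path("site_export").glob("**/*.html"):
    extractor = TextExtractor()
    extractor.feed(path.read_text(encoding="utf-8", errors="ignore"))
    if extractor.words < 300:
        print(f"{path}: only {extractor.words} words - possible thin content")
```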

Once we’ve found all the potential thin content on my clients’ sites, here are the three options I give them for moving forward:

  1. Delete it. If you scraped someone else’s text or use auto-generated content, this is what you’ll need to do.
  2. Update it so it’s better quality. That includes rewriting product descriptions, adding substance to affiliate pages, merging doorway pages and deleting unnecessary duplicate content.
  3. Remove it from search (adding noindex meta tags). If you have to keep some of your thin content pages, put this meta tag in the <head> section of those pages: <meta name="robots" content="noindex">. Or you can use a tool like SEO by Yoast to do this for you.

After you’ve done your best to clean up your site, you can submit a reconsideration request.

Submitting Your Reconsideration Request

Once we’ve done everything we can to clean up a client’s site and fix whatever caused Google to give them a Manual Action in the first place, it’s time to submit a reconsideration request.

To do this yourself, go back to Search Console and click “Request a Review.”


Then you’ll see a box where you can submit your request.

When my clients reach this step, we try to be as specific as possible, including all relevant information about the cleanup process. Sometimes it’s easier to detail it in a Google Doc or Sheets file, then add that to the review request.

Relevant information for your review request might include:

  • A list of bad links you removed from your site
  • A list of spam comments you deleted
  • Details of your malware cleanup

Also be sure to explain how you plan to prevent the same problem in the future.

After you submit, you should get a confirmation from Google. Then hopefully, within a few weeks, you’ll receive communication that the Manual Action has been removed.

If you didn’t do a good enough job and your site’s still violating Webmaster Guidelines, Google will tell you to go back and try again.

The Big Picture

If you get a Manual Action Penalty from Google, it’s not the end of the world. Google created this system to give webmasters an opportunity to clean up their sites and be in line with Webmaster Guidelines.

It could be a lot worse – I know a lot of sites that lost rank from algorithm changes and never fully recovered, no matter what they did to fix the problem.

Just follow the steps in this post to overcome your manual action, and pay close attention to Google’s Webmaster Guidelines overall to avoid another problem in the future.

Have you had a Manual Action Penalty before? Tell me how you recovered in the comments:

 

Image Credits

Featured Image: Pixabay

Screenshots by Aaron Agius. Taken November 2016