An Investigation Into Google’s Maccabees Update

Posted by Dom-Woodman

December brought us the latest piece of algorithm update fun. Google rolled out an update, which was quickly named the Maccabees update, and the articles began rolling in (SEJ, SER).

The webmaster complaints began to come in thick and fast, and I began my normal plan of action: to sit back, relax, and laugh at all the people who have built bad links, spun out low-quality content, or picked a business model that Google has a grudge against (hello, affiliates).

Then I checked one of my sites and saw I’d been hit by it.


Time to check the obvious

I didn’t have access to a lot of sites that were hit by the Maccabees update, but I did have access to a relatively large number of sites, which allowed me to try to identify some patterns and work out what was going on. Full disclaimer: this is a relatively deep investigation of a single site, so it might not generalize to your own.

My first port of call was to verify that there weren’t any really obvious issues, the kind Google hasn’t looked kindly on in the past. This isn’t any sort of official list; it’s more the internal set of things I go and check when things go wrong, and go wrong badly.

Dodgy links & thin content

I know the site well, so I could rule out dodgy links and serious thin content problems pretty quickly.

(For those of you who’d like some pointers on the kinds of things to check for, follow this link down to the appendix! There’ll be one for each section.)

Index bloat

Index bloat is where a website has managed to accidentally get a large number of non-valuable pages into Google’s index. It can be a sign of crawling issues, cannibalization issues, or thin content problems.

Did I call the thin content problem too soon? I did actually have some pretty severe index bloat. The site which had been hit worst by this had the following indexed URLs graph:

However, I’d actually seen that step-function-esque index bloat on a couple of other client sites, which hadn’t been hit by this update.

In both cases, we’d spent a reasonable amount of time trying to work out why this had happened and where it was happening, but after a lot of log file analysis and Google site: searches, nothing insightful came out of it.

The best guess we ended up with was that Google had changed how they measured indexed URLs. Perhaps it now includes URLs with a non-200 status until they stop checking them? Perhaps it now includes images and other static files, and wasn’t counting them previously?

I haven’t seen any evidence that it’s related to m. URLs or actual index bloat — I’m interested to hear people’s experiences, but in this case I chalked it up as not relevant.

Appendix help link

Poor user experience/slow site

Nope, not the case either. Could it be faster or more user-friendly? Absolutely. Most sites can, but I’d still rate the site as good.

Appendix help link

Overbearing ads or monetization?

Nope, no ads at all.

Appendix help link

The immediate sanity checklist turned up nothing useful, so where to turn next for clues?

Internet theories

Time to plow through various theories on the Internet:

  1. The Maccabees update is mobile-first related

    • Nope, nothing here; it’s a mobile-friendly responsive site. (Both of these first points are summarized here.)
  2. E-commerce/affiliate related
    • I’ve seen this one batted around as well, but neither applied in this case, as the site was neither.
  3. Sites targeting keyword permutations
    • I saw this one from Barry Schwartz; this is the one which comes closest to applying. The site didn’t have a vast number of combination landing pages (for example, one for every single combination of dress size and color), but it does have a lot of user-generated content.

Nothing conclusive here either; time to look at some more data.

Working through Search Console data

We’ve been storing all our search console data in Google’s cloud-based data analytics tool BigQuery for some time, which gives me the luxury of immediately being able to pull out a table and see all the keywords which have dropped.

There were a couple of keyword permutations/themes which were particularly badly hit, and I started digging into them. One of the joys of having all the data in a table is that you can do things like plot the rank of each page that ranks for a single keyword over time.
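As a rough sketch of what that grouping looks like once the data is in a table: everything here (the row shape, column names, and sample values) is illustrative rather than the actual schema.

```python
from collections import defaultdict

# Sample rows in the shape flattened Search Console data tends to take:
# one row per (date, keyword, page) with an average position.
rows = [
    {"date": "2017-11-01", "keyword": "blue widgets", "page": "/widgets/blue", "position": 3.1},
    {"date": "2017-12-01", "keyword": "blue widgets", "page": "/widgets/blue", "position": 7.4},
    {"date": "2017-11-01", "keyword": "blue widgets", "page": "/blog/widget-guide", "position": 18.0},
    {"date": "2017-12-01", "keyword": "blue widgets", "page": "/blog/widget-guide", "position": 5.2},
]

def rank_series(rows, keyword):
    """Group rows into a per-page time series of average position for a
    single keyword, ready to hand to a plotting library."""
    series = defaultdict(list)
    for row in sorted(rows, key=lambda r: r["date"]):
        if row["keyword"] == keyword:
            series[row["page"]].append((row["date"], row["position"]))
    return dict(series)
```

Each line on the charts below is one page’s series; a second page suddenly appearing in the output is exactly the cannibalization pattern described next.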

And this finally got me something useful.

The yellow line is the page I want to rank and the page which I’ve seen the best user results from (e.g. lower bounce rates, more pages per session, etc.):

Another example: again, the yellow line represents the page that should be ranking correctly.

In all the cases I found, my primary landing page — which had previously ranked consistently — was now being cannibalized by articles I’d written on the same topic or by user-generated content.

Are you sure it’s a Google update?

You can never be 100% sure, but I haven’t made any changes to this area for several months, so I wouldn’t expect it to be due to recent changes, or delayed changes coming through. The site had recently migrated to HTTPS, but saw no traffic fluctuations around that time.

Currently, I don’t have anything else to attribute this to but the update.

How am I trying to fix this?

The ideal fix would be the one that gets me all my traffic back, but that’s a fuzzier goal than “I want the correct page to rank for the correct keyword,” so the latter is what I’m aiming for here.

And of course the crucial word in all this is “trying”; I’ve only started making these changes recently, and the jury is still out on whether any of it will work.

No-indexing the user-generated content

This one seems like a bit of a no-brainer. These pages bring in an incredibly small percentage of traffic anyway, and that traffic performs worse than it would if users landed on a proper landing page.

I liked having them indexed because they would occasionally start ranking for keyword ideas I’d never have tried by myself, which I could then migrate to the landing pages. But that happened relatively rarely, and on balance it’s perhaps not worth it any more if I’m going to suffer cannibalization on my main pages.

Making better use of the “About” property

I’ve been waiting a while for a compelling place to give this idea a shot.

Broadly, you can sum it up as using the about property to point back to multiple authoritative sources (like Wikidata, Wikipedia, DBpedia, etc.) in order to help Google better understand your content.

For example, you might add JSON-LD like the following to an article about Donald Trump’s inauguration (the sameAs URLs here are illustrative; you’d point at whichever authoritative pages describe your entities):

    {
      "@context": "http://schema.org",
      "@type": "Article",
      "about": [
        { "@type": "Person",
          "name": "President-elect Donald Trump",
          "sameAs": ["https://en.wikipedia.org/wiki/Donald_Trump"] },
        { "@type": "Thing",
          "name": "US",
          "sameAs": ["https://en.wikipedia.org/wiki/United_States"] },
        { "@type": "Thing",
          "name": "Inauguration Day",
          "sameAs": ["https://en.wikipedia.org/wiki/United_States_presidential_inauguration"] }
      ]
    }

The articles of mine that have been ranking are often specific sub-articles about the larger topic; explicitly marking up what each one is about might help Google find better places to use them.

You should absolutely go and read this article/presentation by Jarno Van Driel, which is where I took this idea from.

Combining informational and transactional intents

Not quite sure how I feel about this one. I’ve seen a lot of it, usually where two related terms exist, one more transactional and one more informational. A site will put a large guide on the transactional page (often a category page) and attempt to grab both at once.

This is where the lines started to blur. I had previously been on the side of having two pages, one to target the transactional and another to target the informational.

I’m currently beginning to reconsider whether that’s the correct way to do it. I’ll probably try combining the two in a couple of places and see how it plays out.

Final thoughts

I only got any insight into this problem because I’d been storing Search Console data. I would absolutely recommend storing your Search Console data, so you can do this kind of investigation in the future. Currently I’d recommend paginating the API to get this data; it’s not perfect, but it avoids many other difficulties. You can find a script to do that here (a fork of the previous Search Console script I’ve talked about), which I then use to dump the data into BigQuery. You should also check out Paul Shapiro and JR Oakes, who have both provided solutions that go a step further and handle the database saving as well.
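For reference, the paging loop those scripts implement boils down to something like this sketch; `query_page` stands in for the real Search Console API call, and the names and page size here are illustrative assumptions rather than the actual script.

```python
def fetch_all_rows(query_page, page_size=5000):
    """Page through an endpoint that accepts startRow/rowLimit-style
    arguments and returns a (possibly short) list of rows.
    `query_page` is any callable wrapping the real API request."""
    start = 0
    while True:
        rows = query_page(start_row=start, row_limit=page_size)
        if not rows:
            break
        yield from rows
        if len(rows) < page_size:
            break  # a short page means we've reached the end
        start += page_size

# Exercised here against a stub instead of the live API.
fake_rows = [{"keys": ["keyword-%d" % i]} for i in range(12)]

def fake_query(start_row, row_limit):
    return fake_rows[start_row:start_row + row_limit]

collected = list(fetch_all_rows(fake_query, page_size=5))
```

The short-page check matters because the API simply returns fewer rows (or none) when you run past the end of the data, rather than signalling it explicitly.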

My best guess at the moment is that the Maccabees update made some sort of weighting change which values relevancy more highly and tests more pages that are possibly topically relevant. These newly tested pages were notably weaker and performed as you would expect (less well), which seems to have led to my traffic drop.

Of course, this analysis is currently based off of a single site, so that conclusion might only apply to my site or not at all if there are multiple effects happening and I’m only seeing one of them.

Has anyone seen anything similar or done any deep diving into where this has happened on their site?


Spotting thin content & dodgy links

For those of you who are looking at new sites, there are some quick ways to dig into this.

For dodgy links:

  • Take a look at something like Searchmetrics/SEMRush and see if they’ve had any previous penguin drops.
  • Take a look at tools like Majestic and Ahrefs. You can often do this for free; Majestic, for example, will give you all the links for your domain if you verify ownership.

For spotting thin content:

  • Run a crawl

    • Take a look at anything with a short word count; let’s arbitrarily say less than 400 words.
    • Look for heavy repetition in titles or meta descriptions.
    • Use the tree view (that you can find on Screaming Frog, for example) and drill down into where it has found everything. This will quickly let you see if there are pages where you don’t expect there to be any.
    • See if the number of URLs found is notably different to the indexed URL report.
  • Soon you will be able to take a look at Google’s new index coverage report. (AJ Kohn has a nice writeup here).
  • Browse around with an SEO Chrome plugin that shows indexation. (SEO Meta in 1 Click is helpful; I wrote Traffic Light SEO for this. It doesn’t really matter which you use, though.)
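The first two crawl checks above are easy to automate once you have crawl output; here is a minimal sketch, assuming the crawl has been flattened into dicts of URL, title, and word count (the field names and thresholds are made up for illustration).

```python
def flag_thin_pages(pages, min_words=400):
    """Flag pages with short body copy or a title shared with other
    URLs: the two quickest thin-content signals from a crawl."""
    title_counts = {}
    for page in pages:
        title_counts[page["title"]] = title_counts.get(page["title"], 0) + 1
    flagged = []
    for page in pages:
        reasons = []
        if page["word_count"] < min_words:
            reasons.append("short content")
        if title_counts[page["title"]] > 1:
            reasons.append("duplicate title")
        if reasons:
            flagged.append((page["url"], reasons))
    return flagged

# Tiny sample crawl: one healthy page, one thin page, one duplicate title.
pages = [
    {"url": "/guide", "title": "Widget guide", "word_count": 1200},
    {"url": "/a", "title": "Widgets", "word_count": 150},
    {"url": "/b", "title": "Widgets", "word_count": 900},
]
```

Anything this flags still needs a human look; a 150-word page can be perfectly good if it answers the query.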

Index bloat

The only real place to spot index bloat is the indexed URLs report in Search Console. Debugging it, however, is hard; I would recommend a combination of log files, “site:” searches in Google, and sitemaps when attempting to diagnose it.

If you can get them, the log files will usually be the most insightful.

Poor user experience/slow site

This is a hard one to judge. Virtually every site has things you can class as a poor user experience.

If you don’t have access to any user research on the brand, I’ll go off my gut, combined with a quick scan comparing the site to some competitors. I’m not looking for a perfect experience or anywhere close; I just want not to hate using the main templates that are exposed to search.

For speed, I tend to use WebPageTest as a super general rule of thumb. If the site loads in under 3 seconds, I’m not worried; at 3–6 seconds I’m a little more nervous; anything over that I’d take as being pretty bad.
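As code, that rule of thumb is just a pair of thresholds; the cut-offs are the arbitrary ones above, not anything official from WebPageTest.

```python
def speed_verdict(load_seconds):
    """Bucket a WebPageTest load time using the rough thresholds above."""
    if load_seconds < 3:
        return "not worried"
    if load_seconds <= 6:
        return "a little nervous"
    return "pretty bad"
```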

I realize that’s not the most specific section; a lot of these checks come from experience above everything else.

Overbearing ads or monetization?

Speaking of poor user experience, the most obvious check is to switch off whatever ad blocker you’re running (or, if it’s built into your browser, to switch to one without that feature) and try to use the site without it. For many sites, it will be clear-cut. When it’s not, I’ll go and seek out other specific examples.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Source: Moz

Should SEOs & Content Marketers Play to the Social Networks’ "Stay-On-Our-Site" Algorithms? – Whiteboard Friday

Posted by randfish

Increasingly, social networks are tweaking their algorithms to favor content that remains on their site, rather than send users to an outside source. This spells trouble for those trying to drive traffic and visitors to external pages, but what’s an SEO or content marketer to do? Do you swim with the current, putting all your efforts toward placating the social network algos, or do you go against it and continue to promote your own content? This edition of Whiteboard Friday goes into detail on the pros and cons of each approach, then gives Rand’s recommendations on how to balance your efforts going forward.

Should SEOs and content marketers play to the social networks "stay-on-our-site" algorithms?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about whether SEOs and content marketers, for that matter, should play to what the social networks are developing in their visibility and engagement algorithms, or whether we should say, “No. You know what? Forget about what you guys are doing. We’re going to try and do things on social networks that benefit us.” I’ll show you what I’m talking about.


If you’re using Facebook and you’re posting content to it, Facebook generally tends to frown upon posts that include an external link, lowering their average visibility and their ability to reach your audience. So, on average, posts that include an external link will fare more poorly in Facebook’s news feed algorithm than content that lives exclusively on Facebook.

For example, if you see this video promoted there, it will do more poorly than if Moz and I had promoted a native Facebook video of Whiteboard Friday. But we don’t want that. We want people to come visit our site and subscribe to Whiteboard Friday here, not stay on Facebook, where we only reach 1 out of every 50 or 100 people who might subscribe to our page.

So it’s clearly in our interest to do this, but Facebook wants to keep you on Facebook’s website, because then they can do the most advertising and targeting to you and get the most time on site from you. That’s their business, right?


The same thing is true of Twitter, so it tends to be the case that links off Twitter fare more poorly. Now, I am not 100% sure in Twitter’s case whether this is algorithmic or user-driven; I suspect it’s a little of both. Twitter will make most visible to you, when you log in, the tweets that are self-contained: they live entirely on Twitter. They might contain a bunch of different stuff, a poll or images, or be a thread. But links off Twitter will be dampened.


The same thing is true on Instagram. Well, on Instagram, they’re kind of the worst: they don’t allow links at all. The only thing you can do is a link in your profile. And as of just a couple of weeks ago, more engaging content equals higher placement in the feed. In fact, Instagram has now come out and said that it will show you posts from people you’re not following but that it thinks will be engaging to you, which gives influential Instagram accounts that get lots of engagement an additional benefit, but kind of hurts everyone else you normally follow on the network.


As for LinkedIn, LinkedIn’s algorithm gives extra visibility in the feed to self-contained post content, which is why you see a lot of posts like, “Oh, here’s all the crazy amounts of work I did and what my experience was like building this or doing that.” Self-contained, blog-post-style content on LinkedIn that does not link out will do much better than posts containing an external link, which LinkedIn dampens in its feed visibility algorithm.

Play to the algos?

So all of these sites have these components of their algorithm that basically reward you if you are willing to play to their algos, meaning you keep all of the content on their sites and platform, their stuff, not yours. You essentially play to what they’re trying to achieve, which is more time on site for them, more engagement for them, less people going away to other places. You refuse or you don’t link out, so no external linking to other places. You maintain sort of what I call a high signal to noise ratio, so that rather than sharing all the things you might want to share, you only share posts that you can count on having relatively high engagement.

That track record is something that sticks with you on most of these networks. Facebook, for example, if I have posts that do well, many in a row, I will get more visibility for my next one. If my last couple of posts have performed poorly on Facebook, my next one will be dampened. You sort of get a string or get on a roll with these networks. Same thing is true on Twitter, by the way.

$#@! the algos, serve your own site?

Or you say, “Forget you” to the algorithms and serve your own site instead, which means you use the networks to tease content, like, “Here’s this exciting, interesting thing. If you want the whole story or you want to watch full video or see all the graphs and charts or whatever it is, you need to come to our website where we host the full content.” You link externally so that you’re driving traffic back to the properties that you own and control, and you have to be willing to promote some potentially promotional content, in order to earn value from these social networks, even if that means slightly lower engagement or less of that get-on-a-roll reputation.

My recommendation

The recommendation that I have for SEOs and content marketers is I think we need to balance this. But if I had to, I would tilt it in favor of your site. Social networks, I know it doesn’t seem this way, but social networks come and go in popularity, and they change the way that they work. So investing very heavily in Facebook six or seven years ago might have made a ton of sense for a business. Today, a lot of those investments have been shown to have very little impact, because instead of reaching 20 or 30 out of 100 of your followers, you’re reaching 1 or 2. So you’ve lost an order of magnitude of reach on there. The same thing has been true generally on Twitter, on LinkedIn, and on Instagram. So I really urge you to tilt slightly to your own site.

Owned channels are your website, your email, where you have the email addresses of the people there. I would rather have an email or a loyal visitor or an RSS subscriber than I would 100 times as many Twitter followers, because the engagement you can get and the value that you can get as a business or as an organization is just much higher.

Just don’t ignore how these algorithms work. If you can, I would urge you to sometimes get on those rolls so that you can grow your awareness and reach by playing to these algorithms.

So, essentially, while I’m urging you to tilt slightly this way, I’m also suggesting that occasionally you should use what you know about how these algorithms work in order to grow and accelerate your growth of followers and reach on these networks so that you can then get more benefit of driving those people back to your site. You’ve got to play both sides, I think, today in order to have success with the social networks’ current reach and visibility algorithms.

All right, everyone, look forward to your comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by


Source: Moz

Meet Main Street Hub’s Photographers

Our photographers help tell our customers’ stories

At Main Street Hub, we have a passion for telling our local business customers’ stories through authentic content that is true to their brand voice and identity.

One of the ways we do this is by offering our customers a professional photo shoot when they partner with us.

The photo shoots, powered by Main Street Hub’s nationwide network of experienced freelance photographers, enable small businesses to elevate their online presence and win new customers with compelling photographs that tell their story.

Get to know some of the talented photographers behind our customers’ beautiful photography:

Savannah Claude-Pierre

Atlanta, GA

Joined the team in June 2017
Follow her on Instagram:

What’s your favorite thing about photographing local businesses?

“My favorite thing about photographing local businesses is making connections within the community, getting the opportunity to check out new places, and meeting great people along the way.”

What’s your favorite photo shoot memory with a Main Street Hub customer?

“My favorite memory was at a pet care center, the owner brought out a ferret for me to photograph! Also, when restaurants offer me dishes after I photograph them… doesn’t get better than that!”

Savannah Claude-Pierre for Main Street Hub

Matt Young

Phoenix, AZ

Joined the team in February 2017
Follow him on Instagram:

What’s your favorite thing about photographing local businesses?

“My favorite thing about shooting for Main Street Hub is knowing that I’m part of the process of building a social awareness for local businesses. On a personal note, I would have to say that I become aware of some of the local businesses that are in my own community, and I can help spread the word about their business.”

What’s your favorite photo shoot memory with a Main Street Hub customer?

“My favorite has to be the shoot I did for Desert Wolf Tours in New River, AZ. By far my most fun shoot as I got to go on a jeep tour, and thanks to the tour guides, learned a little about survival in the desert. Did you know you can actually eat a cactus? I can’t recall which one though.

Additionally, there is a photo of a jumping cactus latched onto my shoe. I now know why they call it a jumping cactus, but don’t worry, no injury became of it.”

Matt Young for Main Street Hub

Brittany Castillo

Wilmington, NC

Joined the team in October 2016
Follow her on Instagram:

What’s your favorite thing about photographing local businesses?

“I love to support local businesses. Any chance I get to buy local I take it — it’s good for the community, and it’s good for the environment! I love that I am able to help these local businesses to create compelling, custom visual content. By having high-quality, authentic photos, I know these businesses will stand out among the competition and continue to grow their local brand and garner the attention they deserve.”

What’s your favorite photo shoot memory with a Main Street Hub customer?

“There have been many great photo shoots, but one business that really stands out is Burnt Mill Creek. It’s a hole-in-the-wall bar in a more quiet part of town, but it has an amazing atmosphere. They were hosting their weekly jazz night, and the crowd really felt more like old friends than complete strangers. With a pool table, a jukebox, and a unique, copper penny bar top, this place has a lot of character, and plenty of charm!”

Nathan Zucker

Nashville, TN

Joined the team in February 2017
Follow him on Instagram:

What’s your favorite thing about photographing local businesses?

“Getting to know the people and businesses that are in my area. I’ve been introduced to a lot of local businesses I wouldn’t otherwise have known about.”

What’s your favorite photo shoot memory with a Main Street Hub customer?

“My favorite shoot with Main Street Hub was with my mechanic, J & J Auto Care [in Madison, TN] just a mile from my house. It was pretty cool to be there for work instead of car troubles, and we’ve all become good friends since!”

Nathan Zucker for Main Street Hub

Meet more talented photographers in our network in this blog post.

Are you a photographer who loves helping local businesses? Submit your portfolio to join our freelance photographer network here.

Learn more about our customers and team by following us on Twitter, Facebook, LinkedIn, and Instagram!

Meet Main Street Hub’s Photographers was originally published in Main Street Hub on Medium, where people are continuing the conversation by highlighting and responding to this story.

Source: Main Street Hub

3 Ways To Protect Your Brand in 2018

In a world where brands are everywhere you look, standing out can be difficult, and maintaining a good standing in the eyes of the public can be an even greater task. Crafting a brand in today’s digital world requires many steps and countless hours of planning. For up-and-coming business owners, the fear of failure, along with the risk of putting themselves out there in a new, innovative, and enticing way, can be immobilizing. Then, when you think about the plethora of online review platforms scattered across the web, you may start doubting your decision to start and manage a brand.

However, with the growing number of brands popping up globally comes a growing network of valuable resources teaching and guiding new business owners in the tricks of the trade. By learning from others’ mistakes, you can set your brand up for success and protect that success with ease. Ensuring your brand is in good standing online is only half of the battle. The best place to start when it comes to protecting your brand is with an online reputation management software such as ReviewPush, which can help monitor your brand’s online activity all around the world.


But what do you do once you’ve signed up with an online reputation monitoring software? The answer is simple: you start utilizing it, heavily, in every way possible. You may be asking yourself, “Lee, what does that mean? What does that entail?” If you find yourself in this or a similar position, wondering where to go and what to do in the world of online reputation management, search no further. There are three simple moves your brand can make to ensure you’re protected from all angles at all hours of the day.

1. Respond Promptly

No matter the platform you’re being contacted on or discussed within, it’s best practice to respond as quickly as you can. This not only shows that you value your business and your customers; it also shows that your brand is listening, something crucial in the eyes of consumers. If you receive a negative review, responding as soon as it comes in is widely considered best practice for protecting your brand, as you can confront any issues and resolve any complications before they get out of hand.


2. Utilize Thought Leadership

With so many versatile platforms, it’s very plausible to become a thought leader in your field, and in others, too. By continuously creating positive content for consumers and like-minded businesses to digest, you set yourself apart as a leader in your field and open the door to more business opportunities. When you simply and clearly articulate the benefits of the craft you’ve honed for years, it will not only be well received by a diverse group of readers; it will also put you ahead of your competition while ensuring your content reinforces your brand’s reputation. By opening the door to thought leadership, you’re able to answer the questions and concerns within your industry and among consumers who are interested in investing in your business. This is also a great place to utilize search engine optimization (SEO) by tagging thought pieces with vital keywords. You’ll benefit your SEO ranking while laying the groundwork to become a powerhouse in your vertical, and various others.


3. Be Active On Social

The key with social is engagement, always. Across the various platforms that exist, social media is one of the easiest ways to engage with the community you serve and catch the attention of potential clients. This creates brand awareness while also showcasing your brand in the best possible light. People like brands they can rely on, and that applies to social media as well. If a consumer has a question regarding your business, they’re likely to seek you out on social media and look for an answer, or ask you for one. If you’re not monitoring your social channels, you could miss out on golden opportunities for new business, or even a chance to right a wrong before consumers head to a review site to tell the rest of the world about their experience. Listen, I get it: managing your social media accounts may be low on your priority list as a business owner, but with tools such as SproutSocial, Buffer, and HootSuite, you can schedule posts to multiple social channels weeks or months in advance, so there’s really no excuse for neglecting social media. Even better: hire someone for the sole purpose of managing social media. Anyone who has worked with social media knows there’s enough to it to make it a job of its own, which is why companies have begun hiring social media and community managers.


Brand management and reputation monitoring can seem daunting, but with the right approach and the consumers’ best interest in mind, you’ll form a profitable and successful brand in the public eye in no time. Sometimes you need a little extra help. For times like this, ReviewPush is here for you, your company, your customers, and ultimately your brand.

The post 3 Ways To Protect Your Brand in 2018 appeared first on ReviewPush.

Source: Review Push