As a reputation management pioneer, Nick has the inside scoop on all things reputation management. This blog focuses on reputation practices, technologies, providers, and re-shared content from some of the preeminent players in the industry. We hope you enjoy!

Posts By: Curator

Rewriting the Beginner’s Guide to SEO, Chapter 1: SEO 101

Posted by BritneyMuller

Back in mid-November, we kicked off a campaign to rewrite our biggest piece of content: the Beginner’s Guide to SEO. You offered up a huge amount of helpful advice and insight with regards to our outline, and today we’re here to share our draft of the first chapter.

In many ways, the Beginner’s Guide to SEO belongs to each and every member of our community; it’s important that we get this right, for your sake. So without further ado, here’s the first chapter — let’s dive in!


Chapter 1: SEO 101

What is it, and why is it important?

Welcome! We’re excited that you’re here!

If you already have a solid understanding of SEO and why it’s important, you can skip to Chapter 2 (though we’d still recommend skimming the best practices from Google and Bing at the end of this chapter; they’re useful refreshers).

For everyone else, this chapter will help build your foundational SEO knowledge and confidence as you move forward.

What is SEO?

SEO stands for “search engine optimization.” It’s the practice of increasing both the quality and quantity of website traffic, as well as exposure to your brand, through non-paid (also known as “organic”) search engine results.

Despite the acronym, SEO is as much about people as it is about search engines themselves. It’s about understanding what people are searching for online, the answers they are seeking, the words they’re using, and the type of content they wish to consume. Leveraging this data will allow you to provide high-quality content that your visitors will truly value.

Here’s an example. Frankie & Jo’s (a Seattle-based vegan, gluten-free ice cream shop) has heard about SEO and wants help improving how and how often they show up in organic search results. In order to help them, you need to first understand their potential customers:

  • What types of ice cream, desserts, snacks, etc. are people searching for?
  • Who is searching for these terms?
  • When are people searching for ice cream, snacks, desserts, etc.?
    • Are there seasonality trends throughout the year?
  • How are people searching for ice cream?
    • What words do they use?
    • What questions do they ask?
    • Are more searches performed on mobile devices?
  • Why are people seeking ice cream?
    • Are individuals looking specifically for health-conscious ice cream, or just looking to satisfy a sweet tooth?
  • Where are potential customers located — locally, nationally, or internationally?

And finally — here’s the kicker — how can you help provide the best content about ice cream to cultivate a community and fulfill what all those people are searching for?

Search engine basics

Search engines are answer machines. They scour billions of pieces of content and evaluate thousands of factors to determine which content is most likely to answer your query.

Search engines do all of this by discovering and cataloguing all available content on the Internet (web pages, PDFs, images, videos, etc.) via a process known as “crawling and indexing.”
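To make the idea concrete, here’s a toy sketch in Python of what “crawling and indexing” means at its simplest: follow links from page to page and record which words appear on which URLs. Real search engines are vastly more sophisticated; the seed URL and page limit below are just placeholders for illustration.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects the links and visible text found on a single page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.text.append(data)


def crawl_and_index(seed_url, max_pages=10):
    """Breadth-first crawl from a seed URL, building a tiny inverted index."""
    queue, seen, index = deque([seed_url]), set(), {}
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip pages that fail to load or aren't fetchable
        parser = PageParser()
        parser.feed(html)
        # "Indexing": map each word to the URLs it appears on.
        for word in " ".join(parser.text).lower().split():
            index.setdefault(word, set()).add(url)
        # "Crawling": queue up the links we discovered for later visits.
        for link in parser.links:
            queue.append(urljoin(url, link))
    return index


if __name__ == "__main__":
    # Placeholder seed URL; prints the pages on which the word "example" was seen.
    idx = crawl_and_index("https://example.com")
    print(sorted(idx.get("example", set())))
```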

What are “organic” search engine results?

Organic search results are search results that aren’t paid for (i.e. not advertising). These are the results that you can influence through effective SEO. Traditionally, these were the familiar “10 blue links.”

Today, search engine results pages — often referred to as “SERPs” — are filled with both more advertising and more dynamic organic results formats (called “SERP features”) than we’ve ever seen before. Some examples of SERP features are featured snippets (or answer boxes), People Also Ask boxes, image carousels, etc. New SERP features continue to emerge, driven largely by what people are seeking.

For example, if you search for “Denver weather,” you’ll see a weather forecast for the city of Denver directly in the SERP instead of a link to a site that might have that forecast. And, if you search for “pizza Denver,” you’ll see a “local pack” result made up of Denver pizza places. Convenient, right?

It’s important to remember that search engines make money from advertising. Their goal is to better solve searchers’ queries (within SERPs), to keep searchers coming back, and to keep them on the SERPs longer.

Some SERP features on Google are organic and can be influenced by SEO. These include featured snippets (a promoted organic result that displays an answer inside a box) and related questions (a.k.a. “People Also Ask” boxes).

It’s worth noting that there are many other search features that, even though they aren’t paid advertising, can’t typically be influenced by SEO. These features often have data acquired from proprietary data sources, such as Wikipedia, WebMD, and IMDb.

Why SEO is important

While paid advertising, social media, and other online platforms can generate traffic to websites, the majority of online traffic is driven by search engines.

Organic search results cover more digital real estate, appear more credible to savvy searchers, and receive way more clicks than paid advertisements. For example, of all US searches, only ~2.8% of people click on paid advertisements.

In a nutshell: SEO has ~20X more traffic opportunity than PPC on both mobile and desktop.

SEO is also one of the only online marketing channels that, when set up correctly, can continue to pay dividends over time. If you provide a solid piece of content that deserves to rank for the right keywords, your traffic can snowball over time, whereas advertising needs continuous funding to send traffic to your site.

Search engines are getting smarter, but they still need our help.

Optimizing your site will help deliver better information to search engines so that your content can be properly indexed and displayed within search results.
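As a small illustration of what “delivering better information” can mean in practice, here’s a rough Python sketch that checks a page’s HTML for a few of the basics search engines look at — a title, a meta description, and alt text on images. The HTML snippet is made up, and this is a simplified check rather than a full audit tool.

```python
from html.parser import HTMLParser


class OnPageAudit(HTMLParser):
    """Flags a few basic on-page elements search engines rely on."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.has_meta_description = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.has_meta_description = True
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


# Made-up sample page: it has a title, but no meta description and an image without alt text.
html = "<html><head><title>Vegan Ice Cream in Seattle</title></head><body><img src='cone.jpg'></body></html>"
audit = OnPageAudit()
audit.feed(html)
print("Title:", audit.title or "MISSING")
print("Meta description present:", audit.has_meta_description)
print("Images missing alt text:", audit.images_missing_alt)
```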

Should I hire an SEO professional, consultant, or agency?

Depending on your bandwidth, willingness to learn, and the complexity of your website(s), you could perform some basic SEO yourself. Or, you might discover that you would prefer the help of an expert. Either way is okay!

If you end up looking for expert help, it’s important to know that many agencies and consultants “provide SEO services,” but can vary widely in quality. Knowing how to choose a good SEO company can save you a lot of time and money, as the wrong SEO techniques can actually harm your site more than they will help.

White hat vs black hat SEO

“White hat SEO” refers to SEO techniques, best practices, and strategies that abide by search engine rules; its primary focus is to provide more value to people.

“Black hat SEO” refers to techniques and strategies that attempt to spam/fool search engines. While black hat SEO can work, it puts websites at tremendous risk of being penalized and/or de-indexed (removed from search results) and has ethical implications.

Penalized websites have bankrupted businesses. It’s just another reason to be very careful when choosing an SEO expert or agency.

Search engines share similar goals with the SEO industry

Search engines want to help you succeed. They’re actually quite supportive of efforts by the SEO community. Digital marketing conferences, such as Unbounce, MNsearch, SearchLove, and Moz’s own MozCon, regularly attract engineers and representatives from major search engines.

Google assists webmasters and SEOs through their Webmaster Central Help Forum and by hosting live office hour hangouts. (Bing, unfortunately, shut down their Webmaster Forums in 2014.)

While webmaster guidelines vary from search engine to search engine, the underlying principles stay the same: Don’t try to trick search engines. Instead, provide your visitors with a great online experience.

Google webmaster guidelines

Basic principles:

  • Make pages primarily for users, not search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging.

Things to avoid:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content (i.e. copied from somewhere else)
  • Cloaking — the practice of showing search engine crawlers different content than visitors.
  • Hidden text and links
  • Doorway pages — pages created to rank well for specific searches to funnel traffic to your website.

Full Google Webmaster Guidelines version here.

Bing webmaster guidelines

Basic principles:

  • Provide clear, deep, engaging, and easy-to-find content on your site.
  • Keep page titles clear and relevant.
  • Links are regarded as a signal of popularity and Bing rewards links that have grown organically.
  • Social influence and social shares are positive signals and can have an impact on how you rank organically in the long run.
  • Page speed is important, along with a positive, useful user experience.
  • Use alt attributes to describe images, so that Bing can better understand the content.

Things to avoid:

  • Thin content — pages showing mostly ads or affiliate links, or that otherwise redirect visitors away to other sites, will not rank well.
  • Abusive link tactics that aim to inflate the number and nature of inbound links, such as buying links or participating in link schemes, can lead to de-indexing.
  • Messy URL structures — keep URLs clean, concise, descriptive, and keyword-inclusive when possible, and avoid non-letter characters. Dynamic parameters can dirty up your URLs and cause duplicate content issues (see the sketch after this list).
  • Burying links in JavaScript/Flash/Silverlight; keep content out of these as well.
  • Duplicate content
  • Keyword stuffing
  • Cloaking — the practice of showing search engine crawlers different content than visitors.
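To illustrate the point about dynamic parameters, here’s a small Python sketch showing how several URL variants of the same page can be collapsed into one canonical form by stripping parameters that don’t change the content. The parameter list and URLs are made up for the example; in practice you’d pair this kind of cleanup with canonical tags or your search engine’s parameter-handling settings.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that change tracking/session state but not the page content.
# This list is illustrative only; the right set depends on your own site.
IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}


def canonical_form(url):
    """Strip parameters that don't change the content, so variant URLs collapse to one."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(sorted(kept)), ""))


urls = [
    "https://example.com/ice-cream?utm_source=newsletter",
    "https://example.com/ice-cream?sessionid=abc123",
    "https://example.com/ice-cream",
]
# All three variants collapse to the same canonical URL.
print({canonical_form(u) for u in urls})
```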

Guidelines for representing your local business on Google

These guidelines govern what you should and shouldn’t do in creating and managing your Google My Business listing(s).

Basic principles:

  • Be sure you’re eligible for inclusion in the Google My Business index; you must have a physical address, even if it’s your home address, and you must serve customers face-to-face, either at your location (like a retail store) or at theirs (like a plumber).
  • Honestly and accurately represent all aspects of your local business data, including its name, address, phone number, website address, business categories, hours of operation, and other features.

Things to avoid:

  • Creation of Google My Business listings for entities that aren’t eligible
  • Misrepresentation of any of your core business information, including “stuffing” your business name with geographic or service keywords, or creating listings for fake addresses
  • Use of PO boxes or virtual offices instead of authentic street addresses
  • Abuse of the review portion of the Google My Business listing, via fake positive reviews of your business or fake negative ones of your competitors
  • Costly, novice mistakes stemming from failure to read the fine details of Google’s guidelines

Fulfilling user intent

Understanding and fulfilling user intent is critical. When a person searches for something, they have a desired outcome. Whether it’s an answer, concert tickets, or a cat photo, that desired content is their “user intent.”

If a person performs a search for “bands,” is their intent to find musical bands, wedding bands, band saws, or something else?

Your job as an SEO is to quickly provide users with the content they desire in the format in which they desire it.

Common user intent types:

Informational: Searching for information. Example: “How old is Issa Rae?”

Navigational: Searching for a specific website. Example: “HBOGO Insecure”

Transactional: Searching to buy something. Example: “where to buy ‘We got y’all’ Insecure t-shirt”

You can get a glimpse of user intent by Googling your desired keyword(s) and evaluating the current SERP. For example, if there’s a photo carousel, it’s very likely that people searching for that keyword are looking for photos.

Also evaluate what content your top-ranking competitors are providing that you currently aren’t. How can you provide 10X the value on your website?

Providing relevant, high-quality content on your website will help you rank higher in search results, and more importantly, it will establish credibility and trust with your online audience.

Before you do any of that, you have to first understand your website’s goals to execute a strategic SEO plan.

Know your website/client’s goals

Every website is different, so take the time to really understand a specific site’s business goals. This will not only help you determine which areas of SEO you should focus on, where to track conversions, and how to set benchmarks, but it will also help you create talking points for negotiating SEO projects with clients, bosses, etc.

What will your KPIs (Key Performance Indicators) be to measure the return on SEO investment? More simply, what is your barometer to measure the success of your organic search efforts? You’ll want to have it documented, even if it’s this simple:

For the website ________________________, my primary SEO KPI is _______________.

Here are a few common KPIs to get you started:

  • Sales
  • Downloads
  • Email signups
  • Contact form submissions
  • Phone calls

And if your business has a local component, you’ll want to define KPIs for your Google My Business listings, as well. These might include:

  • Clicks-to-call
  • Clicks-to-website
  • Clicks-for-driving-directions

Notice how “Traffic” and “Ranking” are not on the above lists? This is because, for most websites, ranking well for keywords and increasing traffic won’t matter if the new traffic doesn’t convert (to help you reach the site’s KPI goals).

You don’t want to send 1,000 people to your website a month and have only 3 people convert (to customers). You want to send 300 people to your site a month and have 40 people convert.
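To put numbers on that comparison: 3 conversions out of 1,000 visitors is a 0.3% conversion rate, while 40 out of 300 is roughly 13%. A couple of lines of Python make the math explicit:

```python
def conversion_rate(visitors, conversions):
    """Share of visitors who convert."""
    return conversions / visitors

# 1,000 visitors with 3 conversions vs. 300 visitors with 40 conversions
print(f"{conversion_rate(1000, 3):.1%}")   # 0.3%
print(f"{conversion_rate(300, 40):.1%}")   # 13.3%
```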

This guide will help you become more data-driven in your SEO efforts. Rather than haphazardly throwing arrows all over the place (and getting lucky every once in a while), you’ll put more wood behind fewer arrows.

Grab a bow (and some coffee); let’s dive into Chapter 2 (Crawlers & Indexation).


We’re looking forward to hearing your thoughts on this draft of Chapter 1. What works? Anything you feel could be added or explained differently? Let us know your suggestions, questions, and thoughts in the comments.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Source: Moz

#44 – Which Super Bowl ads were #ReputationRoadkill and which were #ReputationRainmakers?


Subscribe: Google Play | iTunes | RSS

Some were cute, some were thought-provoking, and some were just weird, but which ones had the most impact on the advertiser’s brand?

Each week, Erin Jones and I take a look at the most interesting reputation management stories, answer your questions, and share valuable ORM tactics. In this week’s episode:

  • We rate the reputation impact of some of the most talked-about Super Bowl ads of 2018.

If you have a question you would like us to tackle, please leave a comment below or on my Facebook Page.

Transcript (forgive us for any typos):
Coming Soon!

The post #44 – Which Super Bowl ads were #ReputationRoadkill and which were #ReputationRainmakers? appeared first on Andy Beal .


Source: Andy Beal

Twitter Defamation Claim Defeated by a Question Mark–Boulger v. Woods

This is a defamation lawsuit brought against James Woods by a woman (Portia Boulger) who was wrongly identified as a Nazi supporter online. In March, candidate Trump had a rally in Chicago. The Tribune posted a photo of a woman at the rally giving the Nazi salute. The next day “@voxday” posted the photograph, along with a photograph of plaintiff, identifying plaintiff as “Organizer (Women for Bernie).”

[Screenshot: @voxday’s tweet]

Woods followed up Voxday’s tweet with the one below:

[Screenshot: James Woods’s tweet]

The woman at the rally was later identified as “Birgitt Peterson.” Woods tweeted a correction, although he did not delete his original tweet.

Boulger sued Woods in District Court in Ohio. Woods filed an answer, a motion for judgment on the pleadings, and after the time for service had expired, a motion for summary judgment or for dismissal for failure to perfect service.

The court first says that Woods waived his defenses of insufficient process and lack of personal jurisdiction by not raising them in his motion for judgment on the pleadings. The court’s discussion is probably of interest to Rule 12 geeks, but I’ll spare the rest of us.

Moving on to the merits, the court says that the key question is whether Woods’s tweet constitutes a statement of fact. Ohio courts employ a totality of circumstances analysis where the author is charged with knowing the perspective of the reasonable reader. The court says that the question mark is decisive:

Were it not for the question mark at the end of the text, this would be an easy case. Woods phrased his tweet in an uncommon syntactical structure for a question in English by making what would otherwise be a declarative statement and placing a question mark at the end. Delete the question mark, and the reader is left with an ambiguous statement of fact . . . But the question mark cannot be ignored.

The court says “inquiry itself . . . is not accusation”. That is not to say that a question will automatically insulate the author. But the court says the First Amendment requires the author to receive the benefit of any plausible innocent interpretation. (Interestingly, the court footnotes and rejects an argument that the term “Nazi” is “not actionable as a matter of law”. The court distinguishes the tweet here in that someone is being accused of literally being a Nazi.)

The court also says that the context matters. However, the court struggles with whether readers should automatically be charged with reviewing Woods’s other tweets during this time period, or his tweets generally. The court says that a reader would not review an entire Twitter account in chronological order. In some situations, authors flag that they are making a series of tweets (“1/x”—a tweetstorm!) but typically a reader is exposed to a “changing, disjointed series of brief messages on multiple topics by multiple authors.” Ultimately the court says that context is not determinative and the court can’t reach any conclusions regarding it. However, at least some readers could interpret the tweet as not being a declarative statement of fact. This further points in the direction of the tweet not being actionable.

The court also dismisses the invasion of privacy claim on similar grounds.

__

It’s ironic that Woods, who himself sued an anonymous and hyperbolic Twitter user for calling him a “cocaine addict,” benefited from the context rule. Background on Woods’s lawsuit from Popehat and Eriq Gardner.

We’ve blogged a bunch about courts’ treatment of defamation claims premised on online content. Twitter in particular is a medium where people understand statements to include rhetoric and hyperbole. This case brought to mind Feld v. Conway, where the court said that calling someone “fucking crazy” in a tweet was not actionable.

Twitter defamation cases are fascinating, simply because of the nature of the medium itself. Woods could have used the Retweet button. He could have also used an emoji. 🤔

Case citation: Boulger v. Woods, 2:17-cv-186 (S.D. Ohio Jan. 24, 2018)

Related posts:

Twibel Ruling: Tweeting That Someone is “Fucking Crazy” is Not Defamatory

Hyperlinking to Sources Can Help Defeat Defamation Claims–Adelson v. Harris

Using Links as Citations Helps Gizmodo Defeat a Defamation Claim–Redmond v. Gawker Media

Protip: Don’t Send Emails Threatening to “Inflict the Maximum Amount of Financial Pain” Allowed By Law

Want To Avoid Defaming Someone Online? Link To Your Sources (Forbes Cross-Post)

Social Media Rant Against Airline Employee Wasn’t Defamatory But May Be False Light–Patterson v. Grant-Herms

Calling Out Scraper for “Stealing” Data Is Not Defamatory – Tamburo v. Dworkin

A Twitter Exception for Defamation?

9th Circuit Issues a Blogger-Friendly First Amendment Opinion–Obsidian Finance v. Cox


Source: Eric Goldman Legal

Franchise Networks and the Battle for Brand Authenticity at the Local Level

The struggle to keep franchisees’ social media and online reviews on-brand

By Mike Georgoff, Chief Product Officer

It’s challenging to create a comprehensive social media and reputation strategy for one business, and marketers for franchise networks know it’s even more difficult to set up a strategy that masters micro-moments for the brand overall, as well as individual franchisees located across the country.

For brand networks, their biggest battle is establishing an authentic brand identity for their corporate location and ensuring that brand is reflected at the franchisee level, while at the same time, maintaining each individual location’s local feel.

So, how can marketers create a strategy where the brand network’s franchisees are authentic and on-brand, but also localized and genuine? How can they be certain that every customer service request and review is answered in a way that will resonate with the local community it serves? Marketers first need to overcome these 4 main challenges:

Challenge #1: Posting authentic, localized content for each franchisee

Marketers need to implement a comprehensive content strategy for each franchisee’s pages on Facebook, Twitter, and Instagram. Bob’s Gym — Cincinnati and Bob’s Gym — Cleveland should both feature content that comes from corporate across these platforms (like graphics, information, and news), but marketers should also have a content calendar in place for each franchisee (in line with corporate brand, style, voice, and content guidelines). These calendars indicate which localized content should be posted on which days to maximize engagement with local fans and followers. Since local fans are the target audience for each franchisee, marketers should also monitor when these fans interact most with a franchisee’s social posts in order to reach potential new customers in the community.

Note: Main Street Hub can become a distribution engine for corporate level content as well as a content creation engine for local content. More info about how we help franchisors here.

Challenge #2: Entrusting franchisees with local social media content

When maintaining the online presence of franchisees, marketers for brand networks might think it’s a good idea to assign designated team members at each location to post on social media — with the support of corporate guidelines, templates, and an image library of approved graphics. But, this can be difficult if these team members aren’t experts at using social media for marketing and don’t understand why posting on social media is essential for the brand.

It’s also incredibly difficult to quality-control — franchisees could use the wrong fonts, publish inappropriate posts, share off-brand images or GIFs, and much more. This unknown factor is what marketers fear most, since their job is to ensure consistency across the board and alignment with the brand’s voice and message.

Having a marketing strategy from corporate and entrusting local marketers at the franchisee level might not be enough to distribute and monitor quality content.

Note: Main Street Hub empowers franchisors to execute their brand vision and strategy at the local franchisee level, across thousands of social media touchpoints — blending brand compliance and local authenticity to power an exceptional digital consumer experience.

Challenge #3: Responding to reviews for each franchisee

One of the hardest things for marketers to keep consistent across franchisees is review pages. Feedback on these sites can often be charged, and situations can escalate quickly, especially if an appropriate response is not issued in a timely manner. Plus, one negative occurrence at one location or miscommunication on the part of one franchisee owner can affect other franchisees and reach all the way to corporate.

Marketers need to ensure that their brand networks have a reputation management strategy in place that trickles down to each franchisee in order to handle these incidents and keep each response on each page in line with the overall brand.

Similar to Challenge #2 above, it may seem like the best idea to leave review responses up to each franchisee. But if it’s risky to allow franchisee team members to take the reins for local social media accounts, it’s even riskier to allow team members, who are often not customer service experts, to be the voice of the company. This is true because responses are public, and ultimately affect the brand’s overall image and reputation.

For instance, what if one franchisee responds to a negative review in an offensive or inappropriate manner? That could escalate onto other channels and affect how the public sees not just that franchisee, but the overall brand. Marketers need to ensure that they are protecting their brand networks’ reputation 24/7 for each location.

Note: Main Street Hub’s deep understanding of the franchise landscape helps brand networks increase successful and sincere interactions with their customer base.

Challenge #4: Handling franchisees’ customer service requests on social media

As more consumers turn to social media for their customer service requests, it’s becoming increasingly difficult for marketers for brand networks to keep up. When marketers are controlling all franchisees’ customer requests on social media, it can feel like too many are coming in too quickly to manage properly and respond skillfully.

In fact, a majority of consumers report using Twitter and Facebook for customer service requests, and 33% of consumers even prefer to contact brands using social media rather than calling a business. If these communications are ignored or mishandled, that can cause negative perception for the franchisee and its brand network overall.

Here’s an example from Kogneta showing how a franchisee’s neglect of a negative comment on social media can affect the overall brand — they point to a Dunkin’ Donuts franchisee’s Facebook page:

“Dunkin’ Donuts allows individual franchises to have accounts and this specific account is for a location in New York with a small following of 209 people. As you can see, a regular post was made by the location and then received a negative comment about that location, but from there the situation spread and people began to comment on Dunkin’ Donuts as a whole. It received 59 shares and not one comment from Dunkin’ Donuts. The corporate Facebook account would have been better prepared to do damage control, but instead this franchisee page didn’t even acknowledge it, which reflects upon the brand as a whole.”

Every review, message, and mention on Facebook, Twitter, and Instagram presents new challenges for marketers, since each one has the power to influence consumers and their perspective — not just on the local franchisee’s brand, but on the corporate brand presence across the board.

Note: Main Street Hub empowers franchises with our one-of-a-kind technology to find, engage, and delight their customers, at both the brand and local level.

Challenge: The battle for brand authenticity at the local level

Solution: Main Street Hub

We’ve got you covered. We’ll take the brand network corporate vision and strategy and execute it across the network, across channels, with the right blend of brand compliance and local authenticity.

We’ll be at IFA 2018 at Booth 556, ready to chat with franchise professionals about the challenges franchises face in the world of social media and online reviews — and we’ll show how we can take all of those challenges off your plate, for good.

Read the full press release on the launch of Main Street Hub’s new solution for franchise networks and learn more on our blog here.

Ready to partner with us? Get started with us here.

Follow us on Twitter, Facebook, LinkedIn, and Instagram!



Franchise Networks and the Battle for Brand Authenticity at the Local Level was originally published in Main Street Hub on Medium, where people are continuing the conversation by highlighting and responding to this story.


Source: Main Street Hub

A Look Back at a Great 2017: 5 Major Moz Product Investments and a Sneak Peek Into 2018

Posted by adamf

It’s hard to believe that 2017 is already past. We entered the year with big ambitions and we’ve made some great strides. As has become tradition, I’ve compiled a rundown of some of the most interesting updates that you may have seen (or missed) this past year. We’ve intentionally focused on significant product updates, but I’ve also shared a little about some newer programs that provide value for customers in different ways.

TL;DR, here are some of the larger and more interesting additions to Moz in 2017:

  1. Keywords by Site: Keyword Explorer adds site-based keyword research and competitive intelligence
  2. Site Crawl V2: Overhauled Site Crawl for better auditing and workflow
  3. Major investments in infrastructure: Better performance and resilience across the Moz toolset
  4. New instructor-led training programs: Targeted classes to level-up your SEO knowledge
  5. Customer Success: Custom walkthroughs to help you get the most out of Moz
  6. Bonus! MozPod: Moz’s new free podcast keeps you up to date on the latest industry topics and trends

Big updates

This year and last, we’ve placed a disproportionate focus on releasing large infrastructural improvements, new datasets, and foundational product updates. We feel these are crucial elements that serve the core needs of SEOs and will fuel frequent improvements and iterations for years to come.

To kick things off, I wanted to share some details about two big updates from 2017.


1) Keywords by Site: Leveling up keyword research and intelligence

Rank tracking provides useful benchmarks and insights for specific, targeted keywords, but you can’t track all of the keywords that are relevant to you. Sometimes you need a broader look at how visible your sites (and your competitors’ sites) are in Google results.

We built Keywords by Site to provide this powerful view into your Google presence. This brand-new dataset in Moz significantly extends Keyword Explorer and improves the quality of results in many other areas throughout Moz Pro. Our US corpus currently includes 40 million Google SERPs updated every two weeks, and allows you to do the following:

See how visible your site is in Google results

This view not only shows how authoritative a site is from a linking perspective, but also shows how prominent a site is in Google search results.

Compare your ranking prominence to your competitors

Compare up to three sites to get a feel for their relative scale of visibility and keyword ranking overlap. Click on any section in the Venn diagram to view the keywords that fall into that section.

Dig deep: Sort, filter, and find opportunities, then stash them in keyword lists

For example, let’s say you’re looking to determine which pages or content on your site might only require a little nudge to garner meaningful search visibility and traffic. Run a report for your site in Keyword Explorer and then use the filters to quickly hone in on these opportunities:
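As a rough illustration of that kind of filtering (using made-up data and field names, not Moz’s actual export format), you might pull your ranking keywords and keep only the ones sitting just outside the top spots with worthwhile volume:

```python
# Hypothetical keyword ranking data: the field names and thresholds below are
# made up for illustration, not an actual Keyword Explorer export.
keywords = [
    {"keyword": "vegan ice cream seattle", "rank": 11, "monthly_volume": 1900, "difficulty": 38},
    {"keyword": "gluten free dessert",     "rank": 4,  "monthly_volume": 5400, "difficulty": 55},
    {"keyword": "oat milk soft serve",     "rank": 47, "monthly_volume": 320,  "difficulty": 22},
]

# "Little nudge" opportunities: already ranking on page one or two,
# with meaningful search volume and manageable difficulty.
opportunities = [
    kw for kw in keywords
    if 4 <= kw["rank"] <= 20 and kw["monthly_volume"] >= 500 and kw["difficulty"] <= 60
]

for kw in sorted(opportunities, key=lambda k: k["monthly_volume"], reverse=True):
    print(kw["keyword"], kw["rank"], kw["monthly_volume"])
```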

Our focus on data quality

We’ve made a few decisions to help ensure the freshness and accuracy of our keyword corpus. These add to the cost and work of maintaining this dataset, but we feel they make a discernible difference in quality.

  • We recollect all of our keyword data every 2 weeks. This means that the results you see are more recent and more similar to the results on the day that you’re researching.
  • We cycle up to 15 million of our keywords out on a monthly basis. This means that as new keywords or terms trend up in popularity, we add them to our corpus, replacing terms that are no longer getting much search volume.

A few improvements we’ve made since launch:

  • Keyword recommendations in your campaigns (tracked sites) are much improved and now backed by our keyword corpus.
  • These keyword suggestions are also included in your weekly insights, suggesting new keywords worth tracking and pages worth optimizing.
  • Coming very soon: We’re also on the cusp of launching keyword corpuses for the UK, Canada, and Australia. Stay tuned.

A few resources to help you get more from Keywords by Site:

Try out Keywords by Site!


2) Site Crawl V2: Big enhancements to site crawling and auditing

Another significant project we completed in 2017 was a complete rewrite of our aging Site Crawler. In short, our new crawler is faster, more reliable, can crawl more pages, and surfaces more issues. We’ve also made some enhancements to the workflow, to make regular crawls more customizable and easy to manage. Here are a few highlights:

Week-over-week crawl comparisons

Our new crawler keeps tabs on what happened in your previous crawl to show you which specific issues are no longer present, and which are brand new.

Ignore (to hide) individual issues or whole issue types

This feature was added in response to a bunch of customer requests. While Moz does its best to call out the issues and priorities that apply to most sites, not all sites or SEOs have the same needs. For example, if you regularly noindex a big portion of your site, you don’t need us to keep reminding you that you’ve applied noindex to a huge number of pages. If you don’t want them showing in your reports, just ignore individual issues or the entire issue type.

Another workflow improvement we added was the ability to mark an issue as fixed. This allows you to get it out of your way until the next crawl runs and verifies the fix.

All Pages view with improved sorting and filtering

If you’re prioritizing across a large number of pages or trying to track down an issue in a certain area of your site, you can now sort all pages crawled by Issue Count, Page Authority, or Crawl Depth. You can also filter to show, for instance, all pages in the /blog section of your site that are redirects and have a crawl issue.

Recrawl to verify fixes

Moz’s crawler monitors your site by crawling it every week. But if you’ve made some changes and want to verify them, you can now recrawl your site in between regular weekly crawls instead of waiting for the next crawl to start.

Seven new issues checked and tracked

These include such favorites as detecting Thin Content, Redirect Chains, and Slow Pages. While we were at it, we revamped duplicate page detection and improved the UI to help you better analyze clusters of duplicate content and figure out which page should be canonical.

A few resources to help you get more from Site Crawl:


3) Major investments in infrastructure for performance and resilience

You may not have directly noticed many of the updates we’ve made this year. We made some significant investments in Moz Pro and Moz Local to make them faster, more reliable, and allow us to build new features more quickly. But here are a few tangible manifestations of these efforts:

“Infinite” history on organic Moz Pro search traffic reports

Okay, infinite is a bit of a stretch, but we used to only show the last 12 months or weeks of data. Now we’ll show data from the very inception of a campaign, broken down by weeks or months. This is made possible by an updated architecture that makes full historical data easy to surface and present in the application. It also allows for custom access to selected date ranges.

Also worth noting is that the new visualization shows how many different pages were receiving organic search traffic in context with total organic search traffic. This can help you figure out whether a traffic increase was due to improved rankings across many pages or just a spike in organic traffic for one or a few pages.

More timely and reliable access to Moz Local data at all scales

As Moz Local has brought on more and bigger customers with large numbers of locations, the team discovered a need to bolster systems for speed and reliability. A completely rebuilt scheduling system and improved core location data systems help ensure all of your data is collected and easy to access when you need it.

Improved local data distribution

Moz Local distributes your location data through myriad partners, each of which has its own formats and interfaces. The Local team updated and fine-tuned those third-party connections to improve the quality of the data and speed of distribution.


4) New instructor-led training programs: Never stop learning

Not all of our improvements this year have shown up in the product. Another investment we’ve made is in training. We’ve gotten a lot of requests for this over the years and are finally delivering. Brian Childs, our trainer extraordinaire, has built this program from the ground up. It includes:

  • Boot camps to build up core skills
  • Advanced Seminars to dig into more intensive topics
  • Custom Training for businesses that want a more tailored approach

We have even more ambitious plans for 2018, so if training interests you, check out all of our training offerings here.


5) Customer Success: Helping customers get the most out of Moz

Our customer success program took off this year and has one core purpose: to help customers get maximum value from Moz. Whether you’re a long-time customer looking to explore new features or you’re brand new to Moz and figuring out how to get started, our success team offers product webinars every week, as well as one-on-one product walkthroughs tailored to your needs, interests, and experience level.

The US members of our customer success team hone their skills at a local chocolate factory (Not pictured: our fantastic team members in the UK, Australia, and Dubai)

If you want to learn more about Moz Pro, check out a webinar or schedule a walkthrough.


Bonus! MozPod: Moz’s new free podcast made its debut

Okay, this really strays from product news, but another fun project that’s been gaining momentum is MozPod. This came about as a side passion project by our ever-ambitious head trainer. Lord knows that SEO and digital marketing are fast-moving and ever-changing; to help you keep up on hot topics and new developments, we’ve started MozPod. This podcast covers a range of topics, drawing from the brains of key folks in the industry. With topics ranging from structured data and app store optimization to machine learning and even blockchain, there’s always something interesting to learn about.

Join Brian every week for a new topic and guest.


What’s next?

We have a lot planned for 2018 — probably way too much. But one thing I can promise is that it won’t be a dull year. I prefer not to get too specific about projects that we’ve not yet started, but here are a few things already in the works:

  • A significant upgrade to our link data and toolset
  • On-demand Site Crawl
  • Added keyword research corpuses for the UK, Australia, and Canada
  • Expanded distribution channels for local to include Facebook, Waze, and Uber
  • More measurement and analytics features around local rankings, categories, & keywords
  • Verticalized solutions to address specific local search needs in the restaurant, hospitality, financial, legal, & medical sectors

On top of these and many other features we’re considering, we also plan to make it a lot easier for you to use our products. Right now, we know it can be a bit disjointed within and between products. We plan to change that.

We’ve also waited too long to solve for some specific needs of our agency customers. We’re prioritizing some key projects that’ll make their jobs easier and their relationships with Moz more valuable.


Thank you!

Before I go, I just want to thank you all for sharing your support, suggestions, and critical feedback. We strive to build the best SEO data and platform for our diverse and passionate customers. We could not succeed without you. If you’d like to be a part of making Moz a better platform, please let us know. We often reach out to customers and community members for feedback and insight, so if you’re the type who likes to participate in user research studies, customer interviews, beta tests, or surveys, please volunteer here.



Source: Moz