
New House Bill (Substitute FOSTA) Has More Promising Approach to Regulating Online Sex Trafficking

Tomorrow, the House Judiciary Committee will mark up the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (abbreviated to “FOSTA”). It appears a new substitute version of FOSTA will be marked up, not the bill as introduced. This makes a total of four bill versions: SESTA as introduced, SESTA as amended, FOSTA as introduced, and the new substitute FOSTA. This post recaps this complicated situation.

While I continue to believe that none of the bills are good policy, the substitute FOSTA version is better than SESTA as amended and FOSTA as introduced. Thus, compared to the existing options, I prefer substitute FOSTA. However, it needs some changes that I discuss below.

[Note: a few weeks ago, I testified at the House Energy & Commerce Committee against FOSTA as introduced. I didn’t get a chance to post my testimony here, so I encourage you to check it out as part of reviewing this post.]

Overview of the Substitute FOSTA

The substitute FOSTA has the following main provisions:

  • declarations that Section 230 wasn’t designed to facilitate online prostitution, and sites have been reckless about their impacts on sex trafficking victims.
  • a new crime, 18 USC 2421A, for intentionally promoting or facilitating prostitution.
  • criminal enhancements to 2421A if the defendant (1) promotes or facilitates the prostitution of 5+ victims, or (2) “acts in reckless disregard of the fact that such conduct contributed to sex trafficking in violation of” the federal anti-sex trafficking statute, 18 USC 1591(a).
  • a civil claim for violations of the 2421A enhancements.
  • mandatory restitution for criminal violations.
  • a Section 230 exclusion for state criminal laws that govern behavior that violates 2421A or 1591(a).

Comparing the Substitute FOSTA With SESTA as Amended and FOSTA as Introduced

Intent Standard for Criminality. Opponents of FOSTA/SESTA have regularly objected to online service liability for knowledge or recklessness towards third party content. These scienter standards set up a moderator’s dilemma: if services are liable for what they know about third party content, they may rationally choose to do less policing work as a way of reducing liability-creating knowledge. In contrast, the substitute FOSTA requires intent to promote or facilitate illegal activity (prostitution), which avoids the moderator’s dilemma.

Shifted Emphasis to Online Prostitution. Substitute FOSTA regulates online prostitution, a larger category of activity than sex trafficking, which is defined as compelled or underage prostitution. Prostitution is generally illegal in the US (and the bill contains a safe harbor for the limited circumstances where it’s legal), so the bill “narrowly” criminalizes the defendant’s intent to promote or facilitate illegal activity. However, because prostitution generates so much activity online, this law potentially affects the entire Internet community, including all “legitimate” online services.

Reduced Civil Exposure. Substitute FOSTA creates a new civil claim when a defendant intended to promote or facilitate prostitution AND either (a) involved 5+ victims, or (b) recklessly disregarded how its conduct violated the federal anti-sex trafficking statute 1591(a). The civil claim remains subject to Section 230, and the draft has this unusual provision:

Consistent with section 230 of the Communications Act of 1934 (47 U.S.C. 230), a defendant may be held liable, under this subsection, where promotion or facilitation of prostitution activity includes responsibility for the creation or development of all or part of the information or content provided through any interactive computer service.

This language reinforces Section 230’s existing statutory definition of “information content provider,” which says that a party is an information content provider for any content they create or develop in whole or in part. (I just did a lengthy blog post about the develop-in-part language). All of this is confusing because I’m not sure how the intent requirement intersects with Section 230, i.e., if the defendant has the requisite intent, isn’t that first-party liability? I’ll revisit this odd provision shortly.

National Standards. FOSTA and SESTA as introduced allowed for state law variations of online service regulation without any Section 230 limits. In particular, FOSTA as introduced would have authorized an infinite number of existing and new state crimes that could vary widely across jurisdictions. Substitute FOSTA bases liability only on the federal crime standards, creating a uniform national minimum standard for liability. (States could still pass idiosyncratic laws that add extra elements, but that would make things harder for law enforcement).

How the Substitute FOSTA Could Be Improved

Thus, there are several reasons to prefer substitute FOSTA over the alternatives: its scienter standard avoids the moderator’s dilemma, it sets a single national standard of liability rather than allowing for state law variations, and it directly links any civil liability to the federal crime. Substitute FOSTA can be further improved with these changes:

The 5+ Prostitutes Standard. The enhancement for 5+ prostitutes seems to collapse the base crime into the enhancement. Any online service that publishes third party prostitution ads necessarily “promotes or facilitates the prostitution of 5 or more persons.” Once the threshold “intent” is established, online service defendants will routinely satisfy the enhancement.

This creates avoidable risk of authorizing dubious investigations. Consider how this could play out for giants like Google or Facebook. Despite their best efforts, surely both networks have some online prostitution activity. Let’s hypothesize that 0.01% of their site usage relates to online prostitution. Across a billion-member userbase, a 0.01% online prostitution usage converts to 100,000 users. So even if Google and Facebook get it 99.99% right, state and local prosecutors could still point to tens of thousands of online prostitution incidents as circumstantial evidence of the services’ “intent” to promote or facilitate online prostitution. And the statutory baseline of 5+ prostitutes frames the issue to make the online giants look like hotbeds of prostitution activity. So even if a prosecutor’s case will fail in court, substitute FOSTA would give state and local prosecutors a lot of juice to go after the Internet giants.
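The back-of-the-envelope arithmetic above can be made explicit (the userbase size and violation rate are this post's hypothetical assumptions, not real figures):

```python
# Hypothetical figures from the example above: a billion-member userbase
# where 0.01% (1 in 10,000) of usage relates to online prostitution.
users = 1_000_000_000
incidents = users // 10_000   # 0.01% expressed as integer math

print(f"{incidents:,}")  # prints 100,000
```

Even a 99.99% moderation success rate still leaves a six-figure absolute count of incidents for a prosecutor to cite.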

We can’t easily eliminate the risk of bogus state and local investigations due to substitute FOSTA, but we can blunt the 5+ language. Something like “promotes or facilitates the prostitution of 5 or more persons HIMSELF OR HERSELF (NOT CONSIDERING THE ACTS OR CONTENT OF ANY THIRD PARTIES).” This change would not treat third party promotions as part of the enhancement, and it would take some wind out of the sails of overeager prosecutors who can find many more than 5+ ads on a site.

I’d welcome other thoughts about how we might curb overzealous state and local investigations.

Section 230 Discussion. As mentioned, the civil claim says:

Consistent with section 230 of the Communications Act of 1934 (47 U.S.C. 230), a defendant may be held liable, under this subsection, where promotion or facilitation of prostitution activity includes responsibility for the creation or development of all or part of the information or content provided through any interactive computer service.

This language is more like a statement of intent than compulsory language. At minimum, it should be removed from the bill and put into legislative history.

As drafted, it seems to say that if an online service is responsible for ANY content on its site, it is responsible for ALL content, including all third party content. I don’t think that’s what was intended; and it’s certainly far beyond the law today. The Roommates.com case is quite clear that responsibility for third party content attaches only when the defendant creates or develops what made the content illegal. Thus, I would rephrase this:

A defendant may be held liable, under this subsection, WHEN THE DEFENDANT PROMOTES OR FACILITATES prostitution activity BY CREATING OR DEVELOPING all or part of WHAT MAKES the information or content provided through ITS interactive computer service ILLEGAL.

No Monitoring Obligation. To address the moderator’s dilemma in SESTA, I proposed a new Section 230 addition that emphasizes defendants wouldn’t be punished for trying to police anti-social content. Substitute FOSTA bypasses the moderator’s dilemma, so that provision is less urgent. However, reinforcing that monitoring and filtering is good, not bad, would still help with assertions of intent based on circumstantial evidence. So I reiterate my proposal for a new Section 230(g):

The fact that a provider or user of an interactive computer service has undertaken any efforts (including monitoring and filtering) to identify, restrict access to, or remove, material it considers objectionable shall not be considered in determining its liability for any material that it has not removed or restricted access to.

How Do I Feel About Substitute FOSTA? I continue to believe that Congress does not need to pass any bill: the SAVE Act did the work Congress wanted it to do; the Rentboy and MyRedbook prosecutions show the DOJ has effective legal tools (recall that both involved a prosecution for online prostitution, not sex trafficking, so they cover very similar ground to substitute FOSTA); Congress has other anti-sex trafficking initiatives in queue that may be more helpful; and it’s not empirically clear that efforts to extinguish online prostitution ads will protect victims. So here’s how I’d rank my priorities:

1) best outcome: no legislative changes.

2) second-best outcome (a distant second): substitute FOSTA due to the intent scienter, national legal standard and tight linkage between civil and state law enforcement claims and the federal crime.

3) third-best outcome (substantially behind substitute FOSTA): SESTA as amended, which fixed some of SESTA’s roughest edges but still retained its core imposition of the moderator’s dilemma.

4) fourth-best outcome: FOSTA as introduced. That version is probably already defunct. At least, I hope so.

What Happens Next

Will the Substitute FOSTA Pass the House? At the House E&C hearing, Rep. Wagner didn’t discuss the substitute and mostly spoke against SESTA as amended. This left me unclear where she stands regarding the substitute. However, I have received many indications that Rep. Wagner backs the substitute. As a result, I believe substitute FOSTA supplants FOSTA as introduced–bringing along the 170+ current co-sponsors of the bill.

Most Internet companies will prefer substitute FOSTA over other options. I expect tech advocacy groups will publicly line up behind substitute FOSTA.

However, “tech” support for substitute FOSTA will not be uniform. First, the Internet Association already endorsed SESTA as amended, so they are effectively blocked from embracing substitute FOSTA. Second, some so-called “tech” companies came out in favor of SESTA, and they may continue to support it, especially to score more brownie points with the sponsoring senators. For example, check out this tweet from IBM from Thursday (before substitute FOSTA got publicly posted):

[Image: IBM tweet]

[It says: “IBM remains a strong supporter of . Our view: ‘substitute bill’ does not go nearly far enough to hold accountable those who enable criminal activity. We urge the House to adopt & quickly pass the Senate’s bill.”] I’m sure IBM did this tweet totally on its own initiative, without any prompting from Sen. Portman’s office, and based on its vast expertise in managing content online.

Victims groups may prefer SESTA as amended over substitute FOSTA. At minimum, the “I Am Jane Doe” crowd is on SESTA’s side and against substitute FOSTA (the Twitter chatter over the past few days isn’t voluminous, but it is revealing). At the House E&C Committee hearing, Rep. Blackburn supported FOSTA as introduced, and I’m not sure how she’ll feel about the substitute or if/when the House E&C committee will do its own markup.

So here’s how things look: if Reps. Goodlatte and Wagner are united, even if uneasily, and the Internet community favors or doesn’t oppose substitute FOSTA, there’s a significant chance that substitute FOSTA will pass the House (perhaps with some amendments).

House/Senate Conflicts. SESTA has 52 co-sponsors plus the support of the Internet Association, and some victim advocacy groups will support SESTA 100%. The main thing slowing down SESTA is Sen. Wyden’s hold on the bill.

Sen. Wyden has softer words for substitute FOSTA than for SESTA. On Friday, he said: “This is a smarter, more effective approach to go after criminal sex traffickers and prevent these heinous crimes. I’ve pushed to build on my record of giving law enforcement effective tools to put criminals behind bars, without undermining the foundations of the Internet.” In theory, he could place a hold on substitute FOSTA, but more likely he’ll use his SESTA hold to prioritize the Senate’s consideration of substitute FOSTA.

If SESTA passes the Senate (over Wyden’s hold) and substitute FOSTA passes the House, what happens then? One chamber could acquiesce to the other, the bills could be reconciled in conference (not an easy task, and goofy things could happen there), or there’s a very remote chance of paralysis that results in no bill passing Congress.

Post-Congressional Enactment. SESTA/FOSTA will not be Congress’ last review of Section 230. Congress already addressed online sex trafficking in the 2015 SAVE Act and has revisited the topic a scant 2 years later. Surely the advocates will be back a third time and more. So whatever “wins” the Internet community secures in the SESTA/FOSTA text may be short-lived.

Meanwhile, other victims advocacy groups will be targeting Section 230. We’ve repeatedly identified the terrorist victim group as the next logical advocates. The substitute FOSTA draft, with its high scienter bar, restricted civil claims, and national legal standard, will provide a better starting point for the inevitable clone-and-revise discussions.

Other Comments on Substitute FOSTA

R Street Institute

More SESTA-Related Posts:

* My testimony at the House Energy & Commerce Committee: Balancing Section 230 and Anti-Sex Trafficking Initiatives
* How SESTA Undermines Section 230’s Good Samaritan Provisions
* Manager’s Amendment for SESTA Slightly Improves a Still-Terrible Bill
* Another Human Trafficking Expert Raises Concerns About SESTA (Guest Blog Post)
* Another SESTA Linkwrap (Week of October 30)
* Recent SESTA Developments (A Linkwrap)
* Section 230’s Applicability to ‘Inconsistent’ State Laws (Guest Blog Post)
* An Overview of Congress’ Pending Legislation on Sex Trafficking (Guest Blog Post)
* The DOJ’s Busts of MyRedbook & Rentboy Show How Backpage Might Be Prosecuted (Guest Blog Post)
* Problems With SESTA’s Retroactivity Provision (Guest Blog Post)
* My Senate Testimony on SESTA + SESTA Hearing Linkwrap
* Debunking Some Myths About Section 230 and Sex Trafficking (Guest Blog Post)
* Congress Is About To Ruin Its Online Free Speech Masterpiece (Cross-Post)
* Backpage Executives Must Face Money Laundering Charges Despite Section 230–People v. Ferrer
* How Section 230 Helps Sex Trafficking Victims (and SESTA Would Hurt Them) (guest blog post)
* Sen. Portman Says SESTA Doesn’t Affect the Good Samaritan Defense. He’s Wrong
* Senate’s “Stop Enabling Sex Traffickers Act of 2017”–and Section 230’s Imminent Evisceration
* The “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” Bill Would Be Bad News for Section 230
* WARNING: Draft “No Immunity for Sex Traffickers Online Act” Bill Poses Major Threat to Section 230
* The Implications of Excluding State Crimes from 47 U.S.C. § 230’s Immunity

Not-Actually-the-Best Local SEO Practices

Posted by MiriamEllis

It’s never fun being the bearer of bad news.

You’re on the phone with an amazing prospect. Let’s say it’s a growing appliance sales and repair provider with 75 locations in the western US. Your agency would absolutely love to onboard this client, and the contact is telling you, with some pride, that they’re already ranking pretty well for about half of their locations.

With the right strategy, getting them the rest of the way there should be no problem at all.

But then you notice something, and your end of the phone conversation falls a little quiet as you click through from one of their Google My Business listings in Visalia to Streetview and see… not a commercial building, but a house. Uh-oh. In answer to your delicately worded question, you find out that 45 of this brand’s listings have been built around the private homes of their repairmen — an egregious violation of Google’s guidelines.

“I hate to tell you this…,” you clear your throat, and then you deliver the bad news.


If you do in-house Local SEO, do it for clients, or even just answer questions in a forum, you’ve surely had the unenviable (yet vital) task of telling someone they’re “doing it wrong,” frequently after they’ve invested considerable resources in creating a marketing structure that threatens to topple due to a crack in its foundation. Sometimes you can patch the crack, but sometimes, whole edifices of bad marketing have to be demolished before safe and secure new buildings can be erected.

Here are 5 of the commonest foundational marketing mistakes I’ve encountered over the years as a Local SEO consultant and forum participant. If you run into these in your own work, you’ll be doing someone a big favor by delivering “the bad news” as quickly as possible:

1. Creating GMB listings at ineligible addresses

What you’ll hear:

“We need to rank for these other towns, because we want customers there. Well, no, we don’t really have offices there. We have P.O. Boxes/virtual offices/our employees’ houses.”

Why it’s a problem:

Google’s guidelines state:

  • Make sure that your page is created at your actual, real-world location
  • PO Boxes or mailboxes located at remote locations are not acceptable.
  • Service-area businesses—businesses that serve customers at their locations—should have one page for the central office or location and designate a service area from that point.

All of this adds up to Google saying you shouldn’t create a listing for anything other than a real-world location, but it’s extremely common to see a) spammers simply creating tons of listings for non-existent locations, b) people of good will not knowing the guidelines and doing the same thing, and c) service area businesses (SABs) feeling they have to create fake-location listings because Google won’t rank them for their service cities otherwise.

In all three scenarios, the brand puts itself at risk for detection and listing removal. Google can catch them, competitors and consumers can catch them, and marketers can catch them. Once caught, any effort that was put into ranking and building reputation around a fake-location listing is wasted. Better to have devoted resources to risk-free marketing efforts that will add up to something real.

What to do about it:

Advise the SAB owner to self-report the problem to Google. I know this sounds risky, but Google My Business forum Top Contributor Joy Hawkins let me know that she’s never seen a case in which Google has punished a business that self-reported accidental spam. The owner will likely need to un-verify the spam listings (see how to do that here) and then Google will likely remove the ineligible listings, leaving only the eligible ones intact.

What about dyed-in-the-wool spammers who know the guidelines and are violating them regardless, turning local pack results into useless junk? Get to the spam listing in Google Maps, click the “Suggest an edit” link, toggle the toggle to “Yes,” and choose the radio button for spam. Google may or may not act on your suggestion. If not, and the spam is misleading to consumers, I think it’s always a good idea to report it to the Google My Business forum in hopes that a volunteer Top Contributor may escalate an egregious case to a Google staffer.

2. Sharing phone numbers between multiple entities

What you’ll hear:

“I run both my dog walking service and my karate classes out of my house, but I don’t want to have to pay for two different phone lines.”


“Our restaurant has 3 locations in the city now, but we want all the calls to go through one number for reservation purposes. It’s just easier.”


“There are seven doctors at our practice. Front desk handles all calls. We can’t expect the doctors to answer their calls personally.”

Why it’s a problem:

There are actually multiple issues at hand on this one. First of all, Google’s guidelines state:

  • Provide a phone number that connects to your individual business location as directly as possible, and provide one website that represents your individual business location.
  • Use a local phone number instead of a central, call center helpline number whenever possible.
  • The phone number must be under the direct control of the business.

This rules out having the phone number of a single location representing multiple locations.

Confusing to Google

Google has also been known in the past to phone businesses for verification purposes. Should a business answer “Jim’s Dog Walking” when a Google rep is calling to verify that the phone number is associated with “Jim’s Karate Lessons,” we’re in trouble. Shared phone numbers have also been suspected in the past of causing accidental merging of Google listings, though I’ve not seen a case of this in a couple of years.

Confusing for businesses

As for the multi-practitioner scenario, the reality is that some business models simply don’t allow for practitioners to answer their own phones. Calls for doctors, dentists, attorneys, etc. are traditionally routed through a front desk. This reality calls into question whether forward-facing listings should be built for these individuals at all. We’ll dive deeper into this topic below, in the section on multi-practitioner listings.

Confusing for the ecosystem

Beyond Google-related concerns, Moz Local’s awesome engineers have taught me some rather amazing things about the problems shared phone numbers can create for citation-building campaigns in the greater ecosystem. Many local business data platforms are highly dependent on unique phone numbers as a signal of entity uniqueness (the “P” in NAP is powerful!). So, for example, if you submit both Jim’s Dog Walking and Jim’s Bookkeeping to Infogroup with the same number, Infogroup may publish both listings, but leave the phone number fields blank! And without a phone number, a local business listing is pretty worthless.

It’s because of realities like these that a unique phone number for each entity is a requirement of the Moz Local product, and should be a prerequisite for any citation building campaign.

What to do about it:

Let the business owner know that a unique phone number for each business entity, each business location, and each forward-facing practitioner who wants to be listed is a necessary business expense (and, hey, likely tax deductible, too!). Once the investment has been made in the unique numbers, the work ahead involves editing all existing citations to reflect them. The free tool Moz Check Listing can help you instantly locate existing citations for the purpose of creating a spreadsheet that details the bad data, allowing you to start correcting it manually. Or, to save time, the business owner may wish to invest in a paid, automated citation correction product like Moz Local.

Pro tip: Apart from removing local business listing stumbling blocks, unique phone numbers have an added bonus in that they enable the benefits of associating KPIs like clicks-to-call to a given entity, and existing numbers can be ported into call tracking numbers for even further analysis of traffic and conversions. You just can’t enjoy these benefits if you lump multiple entities together under a single, shared number.

3. Keyword stuffing GMB listing names

What you’ll hear:

“I have 5 locations in Dallas. How are my customers supposed to find the right one unless I add the neighborhood name to the business name on the listings?”


“We want customers to know we do both acupuncture and massage, so we put both in the listing name.”


“Well, no, the business name doesn’t actually have a city name in it, but my competitors are adding city names to their GMB listings and they’re outranking me!”

Why it’s a problem:

Long story short, it’s a blatant violation of Google’s guidelines to put extraneous keywords in the business name field of a GMB listing. Google states:

  • Your name should reflect your business’ real-world name, as used consistently on your storefront, website, stationery, and as known to customers.
  • Including unnecessary information in your business name is not permitted, and could result in your listing being suspended.

What to do about it:

I consider this a genuine Local SEO toughie. On the one hand, Google’s lack of enforcement of these guidelines, and apparent lack of concern about the whole thing, makes it difficult to adequately alarm business owners about the risk of suspension. I’ve successfully reported keyword stuffing violations to Google and have had them act on my reports within 24 hours… only to have the spammy names reappear hours or days afterwards. If there’s a suspension of some kind going on here, I don’t see it.

Simultaneously, Google’s local algo apparently continues to be influenced by exact keyword matches. When a business owner sees competitors outranking him via outlawed practices which Google appears to ignore, the Local SEO may feel slightly idiotic urging guideline-compliance from his patch of shaky ground.

But, do it anyway. For two reasons:

  1. If you’re not teaching business owners about the importance of brand building at this point, you’re not really teaching marketing. Ask the owner, “Are you into building a lasting brand, or are you hoping to get by on tricks?” Smart owners (and their marketers) will see that it’s a more legitimate strategy to build a future based on earning permanent local brand recognition for Lincoln & Herndon, than for Springfield Car Accident Slip and Fall Personal Injury Lawyers Attorneys.
  2. I find it interesting that, in all of Google’s guidelines, the word “suspended” is used only a few times, and one of these rare instances relates to spamming the business title field. In other words, Google is using the strongest possible language to warn against this practice, and that makes me quite nervous about tying large chunks of reputation and rankings to a tactic against which Google has forewarned. I remember that companies were doing all kinds of risky things on the eve of the Panda and Penguin updates and they woke up to a changed webscape in which they were no longer winners. Because of this, I advocate alerting any business owner who is risking his livelihood to chancy shortcuts. Better to build things for real, for the long haul.

Fortunately, it only takes a few seconds to sign into a GMB account and remove extraneous keywords from a business name. If it needs to be done at scale for large multi-location enterprises across the major aggregators, Moz Local can get the job done. Will removing spammy keywords from the GMB listing title cause the business to move down in Google’s local rankings? It’s possible that they will, but at least they’ll be able to go forward building real stuff, with the moral authority to report rule-breaking competitors and keep at it until Google acts.

And tell owners not to worry about Google not being able to sort out a downtown location from an uptown one for consumers. Google’s ability to parse user proximity is getting better every day. Mobile-local packs prove this out. If one location is wrongly outranking another, chances are good the business needs to do an audit to discover weaknesses that are holding the more appropriate listing back. That’s real strategy – no tricks!

4. Creating a multi-site morass

What you’ll hear:

“So, to cover all 3 of our locations, we have greengrocerysandiego.com, greengrocerymonterey.com and greengrocerymendocino.com… but the problem is, the content on the three sites is kind of all the same. What should we do to make the sites different?”


“So, to cover all of our services, we have jimsappliancerepair.com, jimswashingmachinerepair.com, jimsdryerrepair.com, jimshotwaterheaterrepair.com, jimsrefrigeratorrepair.com. We’re about to buy jimsvacuumrepair.com … but the problem is, there’s not much content on any of these sites. It feels like management is getting out of hand.”

Why it’s a problem:

Definitely a frequent topic in SEO forums, the practice of relying on exact match domains (EMDs) proliferates because of Google’s historic bias in their favor. The ranking influence of EMDs has been the subject of a Google update and has lessened over time. I wouldn’t want to try to rank for competitive terms with creditcards.com or insurance.com these days.

But if you believe EMDs no longer work in the local-organic world, read this post in which a fellow’s surname/domain name gets mixed up with a distant city name and he ends up ranking in the local packs for it! Chances are, you see weak EMDs ranking all the time for your local searches — more’s the pity. And, no doubt, this ranking boost is the driving force behind local business models continuing to purchase multiple keyword-oriented domains to represent branches of their company or the variety of services they offer. This approach is problematic for 3 chief reasons:

  1. It’s impractical. The majority of the forum threads I’ve encountered in which small-to-medium local businesses have ended up with two, or five, or ten domains invariably lead to the discovery that the websites are made up of either thin or duplicate content. Larger enterprises are often guilty of the same. What seemed like a great idea at first, buying up all those EMDs, turns into an unmanageable morass of web properties that no one has the time to keep updated, to write for, or to market.
  2. Specific to the multi-service business, it’s not a smart move to put single-location NAP on multiple websites. In other words, if your construction firm is located at 123 Main Street in Funky Town, but consumers and Google are finding that same physical address associated with fences.com, bathroomremodeling.com, decks.com, and kitchenremodeling.com, you are sowing confusion in the ecosystem. Which is the authoritative business associated with that address? Some business owners further compound problems by assuming they can then build separate sets of local business listings for each of these different service-oriented domains, violating Google’s guidelines, which state:

    Do not create more than one page for each location of your business.

    The whole thing can become a giant mess, instead of the clean, manageable simplicity of a single brand, tied to a single domain, with a single NAP signal.

  3. With rare-to-nonexistent exceptions, I consider EMDs to be missed opportunities for brand building. Imagine, if instead of being Whole Foods at WholeFoods.com, the natural foods giant had decided they needed to try to squeeze a ranking boost out of buying 400+ domains to represent the eventual number of locations they now operate. WholeFoodsDallas.com, WholeFoodsMississauga.com, etc? Such an approach would get out of hand very fast.

Even the smallest businesses should take cues from big commerce. Your brand is the magic password you want on every consumer’s lips, associated with every service you offer, in every location you open. As I recently suggested to a Moz community member, be proud to domain your flower shop as rossirovetti.com instead of hoping FloralDelivery24hoursSanFrancisco.com will boost your rankings. It’s authentic, easy to remember, looks trustworthy in the SERPs, and is ripe for memorable brand building.

What to do about it:

While I can’t speak to the minutiae of every single scenario, I’ve yet to be part of a discussion about multi-sites in the Local SEO community in which I didn’t advise consolidation. Basically, the business should choose a single, proud domain and, in most cases, 301 redirect the old sites to the main one, then work to get as many as possible of the external links that pointed to the multi-sites updated to point to the chosen main site. This oldie-but-goodie from the Moz blog provides a further technical checklist from a company that saw a 40% increase in traffic after consolidating domains. I’d recommend that any business nervous about handling the technical aspects of consolidation in-house hire a qualified SEO to help them through the process.
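For those comfortable editing server configuration, the 301 redirect itself is straightforward. Here’s a minimal sketch for Apache’s mod_rewrite, using hypothetical domains; your host, server software, and exact rules will vary, so treat this as an illustration rather than a drop-in config:

```apache
# Hypothetical example: permanently redirect every URL on an old EMD
# (fences.com) to the same path on the consolidated brand domain.
# Goes in the old domain's root .htaccess; assumes mod_rewrite is enabled.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?fences\.com$ [NC]
RewriteRule ^(.*)$ https://www.example-brand.com/$1 [R=301,L]
```

The `R=301` flag signals a permanent move to crawlers, which is what passes the old domain’s link equity along to the consolidated site.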

5. Creating ill-considered practitioner listings

What you’ll hear:

“We have 5 dentists at the practice, but one moved/retired last month and we don’t know what to do with the GMB listing for him.”


“Dr. Green is outranking the practice in the local results for some reason, and it’s really annoying.”

Why it’s a problem:

I’ve saved the most complex for last! Multi-practitioner listings can be a blessing, but they’re so often a bane that my position on creating them has evolved to a point where I only recommend building them in specific cases.

When Google first enabled practitioner listings (listings that represent each doctor, lawyer, dentist, or agent within a business) I saw them as a golden opportunity for a given practice to dominate local search results with its presence. However, Google’s subsequent unwillingness to simply remove practitioner duplicates, coupled with the rollout of the Possum update which filters out shared category/similar location listings, coupled with the number of instances I’ve seen in which practitioner listings end up outranking brand listings, has caused me to change my opinion of their benefits. I should also add that the business title field on practitioner listings is a hotbed of Google guideline violations — few business owners have ever read Google’s nitty gritty rules about how to name these types of listings.

In a nutshell, practitioner listings gone awry can result in a bunch of wrongly-named listings often clouded by duplicates that Google won’t remove, all competing for the same keywords. Not good!

What to do about it:

You’ll have multiple scenarios to address when offering advice about this topic.

1.) If the business is brand new, and there is no record of it on the Internet as of yet, then I would only recommend creating practitioner listings if it is necessary to point out an area of specialization. So, for example, if a medical practice has 5 MDs, the listing for the practice covers that, with no added listings needed. But if a medical practice has 5 MDs and an Otolaryngologist, it may be good marketing to give the specialist his own listing, because it has its own GMB category and won’t be competing with the practice for rankings. However, read on to understand the challenges undertaken any time a multi-practitioner listing is created.

2.) If the multi-practitioner business is not new, chances are very good that there are listings out there for present, past, and even deceased practitioners.

  • If a partner is current, be sure you point his listing at a landing page on the practice’s website, instead of at the homepage, see if you can differentiate categories, and do your utmost to optimize the practice’s own listing — the point here is to prevent practitioners from outranking the practice. What do I mean by optimization? Be sure the practice’s GMB listing is fully filled out, you’ve got amazing photos, you’re actively earning and responding to reviews, you’re publishing a Google Post at least once a week, and your citations across the web are consistent. These things should all strengthen the listing for the practice.
  • If a partner is no longer with the practice, it’s ideal to unverify the listing and ask Google to mark it as moved to the practice — not to the practitioner’s new location. Sound goofy? Read Joy Hawkins’ smart explanation of this convoluted issue.
  • If, sadly, a practitioner has passed away, contact Google to show them an obituary so that the listing can be removed.
  • If a listing represents what is actually a solo practitioner (instead of a partner in a multi-practitioner business model) and his GMB listing is now competing with the listing for his business, you can ask Google to merge the two listings.

3.) If a business wants to create practitioner listings, and they feel up to the task of handling any ranking or situational management concerns, there is one final proviso I’d add. Google’s guidelines state that practitioners should be “directly contactable at the verified location during stated hours” in order to qualify for a GMB listing. I’ve always found this requirement rather vague. Contactable by phone? Contactable in person? Google doesn’t specify. Presumably, a real estate agent in a multi-practitioner agency might be directly contactable, but as my graphic above illustrates, we wouldn’t really expect the same public availability of a surgeon, right? Point being, it may only make marketing sense to create a practitioner listing for someone who needs to be directly available to the consumer public for the business to function. I consider this a genuine grey area in the guidelines, so think it through carefully before acting.

Giving good help

It’s genuinely an honor to advise owners and marketers who are strategizing for the success of local businesses. In our own small way, local SEO consultants live in the neighborhood Mister Rogers envisioned in which you could look for the helpers when confronted with trouble. Given the livelihoods dependent on local commerce, rescuing a company from a foundational marketing mistake is satisfying work for people who like to be “helpers,” and it carries a weight of responsibility.

I’ve worked in 3 different SEO forums over the past 10+ years, and I’d like to close with some things I’ve learned about helping:

  1. Learn to ask the right questions. Small nuances in business models and scenarios can necessitate completely different advice. Don’t be scared to come back with second and third rounds of follow-up queries if someone hasn’t provided sufficient detail for you to advise them well. Read all details thoroughly before replying.
  2. Always, always consult Google’s guidelines, and link to them in your answers. It’s absolutely amazing how few owners and marketers have ever encountered them. Local SEOs are volunteer liaisons between Google and businesses. That’s just the way things have worked out.
  3. Don’t say you’re sure unless you’re really sure. If a forum or client question necessitates a full audit to surface a useful answer, say so. Giving pat answers to complicated queries helps no one, and can actually hurt businesses by leaving them in limbo, losing money, for an even longer time.
  4. Network with colleagues when weird things come up. Ranking drops can be attributed to new Google updates, or bugs, or other factors you haven’t yet noticed but that a trusted peer may have encountered.
  5. Practice humility. 90% of what I know about Local SEO, I’ve learned from people coming to me with problems for which, at some point, I had to discover answers. Over time, the work put in builds up our store of ready knowledge, but we will never know it all, and that’s humbling in a very good way. Community members and clients are our teachers. Let’s be grateful for them, and treat them with respect.
  6. Finally, don’t stress about delivering “the bad news” when you see someone who is asking for help making a marketing mistake. In the long run, your honesty will be the best gift you could possibly have given.

Happy helping!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Source: Moz

Fourth Judge Says Social Media Sites Aren’t Liable for Supporting Terrorists–Pennie v. Twitter

[It’s impossible to blog about Section 230 without reminding you that it remains highly imperiled.]

This is one of the multitudinous 1-800-LAW-FIRM lawsuits against social media services for allegedly providing material support to terrorists. It has filed at least two new cases in the last few days. This particular case involves a 2016 shooting of five Dallas police officers by Micah Johnson, who allegedly was radicalized by online content from the terrorist group Hamas.

Three prior courts have rejected the suits, principally on Section 230 grounds but also with causation concerns. See Fields v. Twitter, Cohen v. Facebook, and Gonzalez v. Facebook. Like the others, the court dismisses the claims.

Lack of Causation

Of the four rulings, this opinion is probably the clearest and most emphatic about the lack of causation. The court says: “Plaintiffs do not meaningfully allege that Hamas itself carried out the attack, or even that it intended for such an attack to occur.” Instead, the complaint tries to bridge the gap by describing “contacts between African American and Palestinian organizations with no apparent relevance to this case.” Thus, the court summarizes, “Plaintiffs seek to [impose] liability for an attack by a person who had engaged on social media with groups that arguably shared an ideological affiliation with groups that received expressions of solidarity from groups that shared an ideological affiliation with a designated foreign terrorist organization to which Defendants provided support.” (Read that sentence a few times and you might conclude the judge meant to be a little tart).

This attenuation is too much for the judge:

The complaint here does not plausibly allege that Hamas “committed, planned, or authorized” the Dallas attack, or that it was “the person who committed” the attack, within any reasonable interpretation of those terms in 18 U.S.C. § 2333(d)….Without some meaningful connection between Hamas and the attack, Defendants’ alleged provision of support to Hamas does not meet even Plaintiffs’ test of proximate cause: absent plausible allegations that Hamas itself was in some way a “substantial factor” in the attack, there is no basis to conclude that any support provided by Defendants to Hamas was a substantial factor.

Section 230

The court says: “the CDA immunizes Defendants from most if not all of Plaintiffs’ claims, because Plaintiffs’ theory of liability rests largely on the premise that Defendants should be held responsible for content created and posted by users (here, Hamas and its affiliates) of Defendants’ interactive computer services.” The court breaks this down in some detail:

  • The court says the later-enacted federal anti-terrorism statute (JASTA) doesn’t trump or repeal Section 230.
  • The complaint alleged that it wasn’t basing its claims on Hamas’ content but instead on the defendants “allowing Hamas to use their services at all.” The court rightly calls BS on this: “Plaintiffs explicitly base their claims on the content that Hamas allegedly posts, because absent offending content, there would be no basis for even the frivolous causal connection that Plaintiffs have alleged between Defendants’ services and the Dallas attack.”
  • The complaint argued that the social media services should be liable for removing an account and then having the accountholder create a new account. The court rejects this based on Section 230(c)(2) and a cite to Roommates.com that defendants may edit some content without becoming liable for all content. This is the right result, but the judge doesn’t really walk through Section 230(c)(2)’s applicability in detail.
  • Ad targeting doesn’t, by itself, constitute content development.

The assertion that YouTube shares its ad revenue gives the court some pause. The court says “whether, or under what circumstances, the CDA immunizes payments made by interactive service providers to content developers appears to be a novel issue.” Google cited Blumenthal v. Drudge, in which Section 230 applied even though AOL paid Matt Drudge for allegedly defamatory content, but the court distinguishes Blumenthal because it “does not address the question of whether the CDA immunizes payments that otherwise could themselves give rise to liability.” In other words, the legal question isn’t paying for illegal content; it’s the legality of putting any money into these specific pockets. As the court says, “Providing money to Matt Drudge generally is legal; providing money to Hamas generally is not.” Thus, “the Court declines to resolve the question of if or how the CDA applies where an interactive service provider shares advertising revenue with a content developer that has been designated as a foreign terrorist organization.” The court instead rests its dismissal of this point on lack of causation. We can expect the plaintiff’s counsel will revisit this point in its other cases.

What’s Next?

Some predictions:

  • 1-800-LAW-FIRM will keep filing new lawsuits despite its poor track record in court and its loss here.
  • In future cases, it will zero in on social media services’ payments to terrorists. This argument doesn’t help against Twitter or Facebook, but it might provide a Section 230 bypass in YouTube’s case. However, I wonder if the plaintiffs can show that YouTube “knew” the accounts were held by terrorists who could not legally receive payments, and how many such accounts actually got any payments.
  • Other “material support for illegal activity” claims will be litigated in parallel, such as the Dyroff v. Experience Project case involving the online sale of illegal drugs. That ruling cited heavily to the existing “material support to terrorists” opinions, and favorable rulings in the non-terrorism cases will make it even harder for the plaintiffs to win the anti-terrorism cases.
  • Once SESTA (or one of the rival versions) passes, anti-terrorist groups will be asking Congress for a new exclusion to Section 230. While SESTA is a major concern to the online community, a follow-on Section 230 exclusion related to terrorist groups absolutely would be an existential battle over the Internet.

Last week, the Ninth Circuit heard oral arguments in the Fields v. Twitter case. I haven’t heard much about the hearing. Obviously, the Ninth Circuit’s decision has important implications for this entire line of cases.

Case citation: Pennie v. Twitter, Inc., 2017 WL 5992143 (N.D. Cal. Dec. 4, 2017). The complaint.

Source: Eric Goldman Legal

Mugshot Websites: How They Work, And How You Can Protect Yourself From a Career-Ending Embarrassment

Introduction After serving time for retail theft, Peter Gabiola had tried hard to turn over a new leaf. He finished a stint on parole and found a job. But a searing story in the Chicago Tribune shows how hard it can be to outrun the past, especially when that past is posted online. On the […]

The post Mugshot Websites: How They Work, And How You Can Protect Yourself From a Career-Ending Embarrassment appeared first on Defamation Removal Law.

Source: Aaron Minc

What Do Google’s New, Longer Snippets Mean for SEO? – Whiteboard Friday

Posted by randfish

Snippets and meta descriptions have brand-new character limits, and it’s a big change for Google and SEOs alike. Learn about what’s new, when it changed, and what it all means for SEO in this edition of Whiteboard Friday.

What do Google's new, longer snippets mean for SEO?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about Google’s big change to the snippet length.

This is the display length of the snippet for any given result in the search results that Google provides, on both mobile and desktop. It impacts how meta descriptions are written, because snippets are typically taken from the meta description tag of the web page. Google essentially said just last week, “Hey, we have officially increased the length, the recommended length, and the display length of what we will show in the text snippet of standard organic results.”

So I’m illustrating that for you here. I did a search for “net neutrality bill,” something that’s on the minds of a lot of Americans right now. You can see here that this article from The Hill, which is a recent article — it was two days ago — has a much longer text snippet than what we would normally expect to find. In fact, I went ahead and counted this one and then showed it here.

So basically, the old 165-character limit is what you would have seen prior to the middle of November on almost every search result; occasionally Google would show a longer one for very specific kinds of search results. According to data from SISTRIX, which put out a great report that I’ll link to here, more than 90% of search snippets were 165 characters or less prior to the middle of November. Then Google added basically a few more lines.

So now, on mobile and desktop, instead of an average of two or three lines, we’re talking three, four, five, sometimes even six lines of text. So this snippet here is 266 characters that Google is displaying. The next result, from Save the Internet, is 273 characters. Again, this might be because Google sort of realized, “Hey, we almost got all of this in here. Let’s just carry it through to the end rather than showing the ellipsis.” But you can see that 165 characters would cut off right here. This one actually does a good job of displaying things.

So imagine a searcher is querying for something in your field and they’re just looking for a basic understanding of what it is. So they’ve never heard of net neutrality. They’re not sure what it is. So they can read here, “Net neutrality is the basic principle that prohibits internet service providers like AT&T, Comcast, and Verizon from speeding up, slowing down, or blocking any . . .” And that’s where it would cut off. Or that’s where it would have cut off in November.

Now, if I got a snippet like that, I need to visit the site. I’ve got to click through in order to learn more. That doesn’t tell me enough to give me the data to go through. Now, Google has tackled this before with things, like a featured snippet, that sit at the top of the search results, that are a more expansive short answer. But in this case, I can get the rest of it because now, as of mid-November, Google has lengthened this. So now I can get, “Any content, applications, or websites you want to use. Net neutrality is the way that the Internet has always worked.”

Now, you might quibble and say this is not a full, thorough understanding of what net neutrality is, and I agree. But for a lot of searchers, this is good enough. They don’t need to click any more. This extension from 165 to 275 or 273, in this case, has really done the trick.

What changed?

So this can have a bunch of changes to SEO too. So the change that happened here is that Google updated basically two things. One, they updated the snippet length, and two, they updated their guidelines around it.

So Google has had historic guidelines that said, well, you want to keep your meta description tag between about 160 and 180 characters — I think that was the number. They’ve updated that to say there’s no official recommended meta description length. But on Twitter, Danny Sullivan said that he would probably not make it greater than 320 characters. In fact, we and other data providers that collect a lot of search results didn’t find many that extended beyond 300. So I think that’s a reasonable limit.


When did this happen? It started at about mid-November; November 22nd is when SISTRIX’s dataset begins to show the increase. As of December 2nd, about 51% of search results have these longer snippets in at least 1 of the top 10.

Here’s the amazing thing, though — 51% of search results have at least one longer snippet, but many of the rest, because they’re still pulling old meta descriptions or descriptions that SEOs optimized for the 165-character limit, are still very short. So, especially right now at holiday time, with lots of ecommerce action, if you’re the one to go update your important pages, you might be able to get more real estate in the search results than any of your competitors in the SERPs, because they’re not updating theirs.

How will this affect SEO?

So how is this going to really change SEO? Well, three things:

A. It changes how marketers should write and optimize the meta description.

We’re going to be writing a little bit differently because we have more space. We’re going to be trying to entice people to click, but we’re going to be very conscientious that we want to try and answer a lot of this in the search result itself, because if we can, there’s a good chance that Google will rank us higher, even if we’re actually sort of sacrificing clicks by helping the searcher get the answer they need in the search result.

B. It may impact click-through rate.

We’ll be looking at Jumpshot data over the next few months and the year ahead. We think there are two likely ways it could go: probably negatively, meaning fewer clicks on less complex queries, but conversely, possibly more clicks on some more complex queries, because people are more enticed by the longer description. Fingers crossed, that’s what you want as a marketer.

C. It may lead to lower click-through rate further down in the search results.

If you think about the fact that two results now take up the real estate that three results occupied as of a month ago, well, maybe people won’t scroll as far down. Maybe the results higher up will in fact draw more of the clicks, and thus being further down on page one will have less value than it used to.

What should SEOs do?

What are things that you should do right now? Number one, make a priority list — you should probably already have this — of your most important landing pages by search traffic, the ones that receive the most search traffic on your website, organic search. Then I would go and reoptimize those meta descriptions for the longer limits.

Now, you can judge as you will. My advice would be go to the SERPs that are sending you the most traffic, that you’re ranking for the most. Go check out the limits. They’re probably between about 250 and 300, and you can optimize somewhere in there.

The second thing I would do is if you have internal processes or your CMS has rules around how long you can make a meta description tag, you’re going to have to update those probably from the old limit of somewhere in the 160 to 180 range to the new 230 to 320 range. It doesn’t look like many are smaller than 230 now, at least limit-wise, and it doesn’t look like anything is particularly longer than 320. So somewhere in there is where you’re going to want to stay.
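If you want to audit your priority pages programmatically, a quick script can flag descriptions that fall outside the new range. Here’s a rough sketch in Python using only the standard library; the 230–320 bounds mirror the ranges discussed above, and the function names and sample HTML are illustrative assumptions, not any official tool:

```python
from html.parser import HTMLParser

# Bounds based on the discussion above: descriptions much shorter than ~230
# characters leave new SERP real estate on the table; beyond ~320 they truncate.
MIN_LEN, MAX_LEN = 230, 320

class MetaDescriptionParser(HTMLParser):
    """Pulls the content of <meta name="description"> out of an HTML page."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def check_description(html):
    """Return a short verdict on a page's meta description length."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    desc = parser.description
    if desc is None:
        return "missing"
    if len(desc) < MIN_LEN:
        return f"too short ({len(desc)} chars)"
    if len(desc) > MAX_LEN:
        return f"too long ({len(desc)} chars)"
    return f"ok ({len(desc)} chars)"

page = '<html><head><meta name="description" content="' + "x" * 150 + '"></head></html>'
print(check_description(page))  # prints: too short (150 chars)
```

Run it across your list of top landing pages (fetching each page’s HTML however you normally would) and you’ll have a quick shortlist of descriptions worth rewriting first.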

Good luck with your new meta descriptions and with your new snippet optimization. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Source: Moz