Posts By: Curator
We recently released the 2017 Country RepTrak®, the study of the most reputable countries in the world. Canada, Switzerland and Sweden claimed the top 3 spots and all three countries have ranked in the top 5 of the study consistently over the last 6 years. This blog will explore what happened further down the list of top countries and the powerful political and economic factors influencing this year’s results.
What is the Country RepTrak® – and why does it matter?
The Country RepTrak® study is conducted annually by surveying over 39,000 consumers from G8 countries. The 55 countries included in the study are the largest economies in the world by GDP. Countries are evaluated in the areas of advanced economy, effective government and appealing environment. And as to why it matters, our data shows that country reputation has a direct impact on its economy. More specifically, an increase in a country’s RepTrak® Pulse score by 1 point in a market segment results in an average increase of 2.7% in the arrival of tourists from that market and an average growth of 1.7% in exports to that market.
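To make the quoted averages concrete, here is a back-of-the-envelope sketch. The function and the baseline figures are illustrative inventions; only the 2.7% and 1.7% elasticities come from the study.

```python
# Illustrative sketch of the average relationship quoted above:
# +1 Pulse point in a market segment ~ +2.7% tourist arrivals from that
# market and +1.7% exports to it (reported averages, not a predictive model).

def projected_change(pulse_delta, tourists, exports):
    """Apply the quoted average elasticities to hypothetical baselines."""
    return {
        "tourists": tourists * (1 + 0.027 * pulse_delta),
        "exports": exports * (1 + 0.017 * pulse_delta),
    }

# Hypothetical baseline: 1,000,000 arrivals and $2B in exports per year
result = projected_change(pulse_delta=1, tourists=1_000_000, exports=2_000_000_000)
print(result)
```

On these made-up baselines, a 1-point Pulse gain works out to roughly 27,000 extra arrivals and $34M in additional exports.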
US, UK and Russia Take Hits to Their Reputations in 2017
After President Trump took office, the reputation of the United States dropped by 8.1%, or 11 positions in the ranking. The aspects where the United States saw the greatest declines were effective government, social welfare, ethical country with high transparency, and responsible participant in the global community. US scores decreased across the board, particularly among respondents in Mexico, where Trump’s antagonistic message towards the neighboring country hit home. Interestingly, there was an increased feeling of empathy within the US population towards Mexico.
Russia experienced the second highest drop in the rankings, related to its growing role in global policy, which has generated suspicions among international observers. The biggest decreases in ratings were in the areas of ethical country with high transparency and responsible participant in the global community.
The reputation of the United Kingdom fell by 3.8%, losing 5 positions in the ranking as a result of Brexit. Interestingly, though, the internal reputation of the UK (how the country views itself) is at a record high. Brazil faces the opposite situation: moving out of the negative headlines surrounding the Olympics and World Cup has allowed the country’s reputation among the international community to recover, but its internal reputation and its reputation within Latin America remain low.
Increased Economic Stability Aided Greece, Spain and Portugal
Greece experienced the highest improvement in reputation, with an increase in Pulse score of 14.3% and a climb of 7 positions. This is largely due to fewer critical headlines in the international economic press.
Spain and Portugal continue to improve their reputations as their economies gradually recover. Spain’s Pulse score improved by 5.2% and climbed 4 positions in the rankings and Portugal’s Pulse score increased 3.1% and climbed 2 positions in the ranking.
2017 Most Reputable Countries – Rankings & Scores
More on country reputation:
- The World’s Most Reputable Countries In 2017: The U.S. Feels The Trump Effect – Forbes
- What gives a nation a great reputation? – Policy Options
- Canada Named ‘Most Reputable Country’ In Time For 150th Birthday – Huffington Post
Your online reputation is just as important as, if not more important than, your offline reputation. What goes on the internet about you or your business can stay there until the internet stops working. Although this may […]
Source: Search Reputation
Google (Almost Certainly) Has an Organic Quality Score (Or Something a Lot Like It) that SEOs Need to Optimize For – Whiteboard Friday
Posted by randfish
Entertain the idea, for a moment, that Google assigned a quality score to organic search results. Say it was based off of click data and engagement metrics, and that it would function in a similar way to the Google AdWords quality score. How exactly might such a score work, what would it be based off of, and how could you optimize for it?
While there’s no hard proof it exists, the organic quality score is a concept that’s been pondered by many SEOs over the years. In today’s Whiteboard Friday, Rand examines this theory inside and out, then offers some advice on how one might boost such a score.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about organic quality score.
So this is a concept. This is not a real thing that we know Google definitely has. But there’s this concept that SEOs have felt for a long time: that, similar to the paid quality score in Google’s AdWords program, where a page has a certain score assigned to it, Google almost certainly has something comparable on the organic side. I’ll give you an example of how that might work.
So, for example, if on my site.com I have these three — this is a very simplistic website — but I have these three subfolders: Products, Blog, and About. I might have a page in my products, 14axq.html, and it has certain metrics that Google associates with it through activity that they’ve seen from browser data, from clickstream data, from search data, and from visit data from the searches and bounces back to the search results, and all these kinds of things, all the engagement and click data that we’ve been talking about a lot this year on Whiteboard Friday.
So they may have these metrics, pogo stick rate and bounce rate and a deep click rate (the rate with which someone clicks to the site and then goes further in from that page), the time that they spend on the site on average, the direct navigations that people make to it each month through their browsers, the search impressions and search clicks, perhaps a bunch of other statistics, like whether people search directly for this URL, whether they perform branded searches. What rate do unique devices in one area versus another area do this with? Is there a bias based on geography or device type or personalization or all these kinds of things?
But regardless of that, you get this idea that Google has this sort of sense of how the page performs in their search results. That might be very different across different pages and obviously very different across different sites. So maybe this blog post over here on /blog is doing much, much better in all these metrics and has a much higher quality score as a result.
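The kind of per-page scoring described above can be sketched in a few lines. To be clear, everything here is hypothetical: the metric names follow the transcript, but the weights and the formula are invented for illustration, and whatever Google actually does (if it does this at all) is certainly far more sophisticated.

```python
# Hypothetical sketch of a page-level "organic quality score" built from
# the engagement metrics discussed above. Weights are invented; this is
# not Google's formula.

def quality_score(metrics):
    """Combine engagement metrics (fractions in [0, 1]) into a 0-100 score.
    Deep clicks, SERP CTR, and direct visits count for the page;
    pogo-sticking and bouncing count against it."""
    positive = (
        0.4 * metrics["deep_click_rate"]      # clicked through, then went deeper
        + 0.3 * metrics["serp_ctr"]           # search impressions -> clicks
        + 0.3 * metrics["direct_visit_rate"]  # direct navigations / total visits
    )
    negative = 0.5 * metrics["pogo_stick_rate"] + 0.5 * metrics["bounce_rate"]
    return round(100 * max(0.0, positive - negative), 1)

# A strong blog post vs. a weak product page, with made-up metrics
blog_post = {"deep_click_rate": 0.6, "serp_ctr": 0.3, "direct_visit_rate": 0.2,
             "pogo_stick_rate": 0.1, "bounce_rate": 0.3}
product_page = {"deep_click_rate": 0.2, "serp_ctr": 0.1, "direct_visit_rate": 0.05,
                "pogo_stick_rate": 0.4, "bounce_rate": 0.6}
print(quality_score(blog_post), quality_score(product_page))
```

The point of the sketch is only the shape of the idea: pages in the same site can end up with very different scores depending on how searchers behave once they land.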
Current SEO theories about organic quality scoring:
Now, when we talk to SEOs, and I spend a lot of time talking to my fellow SEOs about theories around this, a few things emerge. I think most folks are generally of the opinion that if there is something like an organic quality score…
1. It is probably based on this type of data — queries, clicks, engagements, visit data of some kind.
We don’t doubt for a minute that Google has much more sophistication than the super-simplified stuff that I’m showing you here. I think Google publicly denies using many individual metrics, saying things like, “No, we don’t use time on site. Time on site could be very variable, and sometimes low time on site is actually a good thing.” Fine. But there’s something in there, right? They use some more sophisticated form of that.
2. We also are pretty sure that this is applying on three different levels:
This is an observation from experimentation as well as from Google statements which is…
- Domain-wide, so that would be across one domain, if there are many pages with high quality scores, Google might view that domain differently from a domain with a variety of quality scores on it or one with generally low ones.
- Same thing for a subdomain. So it could be that a subdomain is looked at differently than the main domain, or that two different subdomains may be viewed differently. If content appears to have high quality scores on this one, but not on this one, Google might generally not pass all the ranking signals or give the same weight to the quality scores over here or to the subdomain over here.
- Same thing is true with subfolders, although to a lesser extent. In fact, this is kind of in descending order. So you can generally surmise that Google will pass these more across subfolders than they will across subdomains and more across subdomains than across root domains.
3. A higher density of good scores to bad ones can mean a bunch of good things:
- More rankings and visibility, even without other signals. So even if a page is somewhat lacking in these other quality signals, if it is in this blog section and the blog section tends to have high quality scores across its pages, Google might give that page an opportunity to rank well that it wouldn’t ordinarily give a page with those ranking signals in another subfolder, on another subdomain, or on another website entirely.
- Some sort of what we might call “benefit of the doubt”-type of boost, even for new pages. So a new page is produced. It doesn’t yet have any quality signals associated with it, but it does particularly well.
As an example, within a few minutes of this Whiteboard Friday being published on Moz’s website, which is usually late Thursday night or very early Friday morning, at least Pacific time, I will bet that you can search for “Google organic quality score” or even just “organic quality score” in Google’s engine, and this Whiteboard Friday will perform very well. One of the reasons that probably is, is because many other Whiteboard Friday videos, which are in this same subfolder, Google has seen them perform very well in the search results. They have whatever you want to call it — great metrics, a high organic quality score — and because of that, this Whiteboard Friday that you’re watching right now, the URL that you see in the bar up above is almost definitely going to be ranking well, possibly in that number one position, even though it’s brand new. It hasn’t yet earned the quality signals, but Google assumes, it gives it the benefit of the doubt because of where it is.
- We surmise that there’s also more value that gets passed from links, both internal and external, from pages with high quality scores. That is right now a guess, but something we hope to validate more, because we’ve seen some signs and some testing that that’s the case.
3 ways to boost your organic quality score
If this is true — and it’s up to you whether you want to believe that it is or not — even if you don’t believe it, you’ve almost certainly seen signs that something like it is going on. I would urge you to do these three things to boost your organic quality score or whatever you believe is causing these same elements.
1. You could add more high-performing pages. So if you know that pages perform well and you know what those look like versus ones that perform poorly, you can make more good ones.
2. You can improve the quality score of existing pages. So if this one is kind of low, you’re seeing that these engagement and use metrics, the SERP click-through rate metrics, the bounce rate metrics from organic search visits, all of these don’t look so good in comparison to your other stuff, you can boost it, improve the content, improve the navigation, improve the usability and the user experience of the page, the load time, the visuals, whatever you’ve got there to hold searchers’ attention longer, to keep them engaged, and to make sure that you’re solving their problem. When you do that, you will get higher quality scores.
3. Remove low-performing pages through a variety of means. You could take a low-performing page and you might say, “Hey, I’m going to redirect that to this other page, which does a better job answering the query anyway.” Or, “Hey, I’m going to 404 that page. I don’t need it anymore. In fact, no one needs it anymore.” Or, “I’m going to noindex it. Some people may need it, maybe the ones who are visitors to my website, who need it for some particular direct navigation purpose or internal purpose. But Google doesn’t need to see it. Searchers don’t need it. I’m going to use noindex, either in the meta robots tag or in the robots.txt file.”
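The three options in this step (plus “keep”) amount to a triage decision, which can be sketched as a small helper. The thresholds and field names here are invented for illustration; you would pick cutoffs based on your own site’s baselines.

```python
# Hypothetical triage helper for low-performing pages: keep, 301 redirect,
# noindex, or 404. Thresholds are invented examples, not recommendations.

def triage(page):
    """page: dict with organic_visits (per month), ctr (SERP click-through
    rate), has_better_alternative, and needed_internally flags."""
    if page["organic_visits"] >= 100 and page["ctr"] >= 0.02:
        return "keep"            # performing fine; improve rather than remove
    if page["has_better_alternative"]:
        return "301 redirect"    # consolidate into the page that answers better
    if page["needed_internally"]:
        # Keep it for users but hide it from search, e.g. with
        # <meta name="robots" content="noindex"> in the page's <head>
        return "noindex"
    return "404"                 # nobody needs it anymore

print(triage({"organic_visits": 4, "ctr": 0.001,
              "has_better_alternative": False, "needed_internally": True}))
```

The example page gets “noindex”: no better page to redirect to, but still needed for direct navigation.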
One thing that’s really interesting to note is we’ve seen a bunch of case studies, especially since MozCon, when Britney Muller, Moz’s Head of SEO, shared the fact that she had done some great testing around removing tens of thousands of low-quality, really low-quality performing pages from Moz’s own website and seen our rankings and our traffic for the remainder of our content go up quite significantly, even controlling for seasonality and other things.
That was pretty exciting. When we shared that, we got a bunch of other people from the audience and on Twitter saying, “I did the same thing. When I removed low-performing pages, the rest of my site performed better,” which really strongly suggests that there’s something like a system in this fashion that works in this way.
So I’d urge you to go look at your metrics, go find pages that are not performing well, see what you can do about improving them or removing them, see what you can do about adding new ones that are high organic quality score, and let me know your thoughts on this in the comments.
We’ll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
By Summer Williams
Summer Williams grew up in Weatherford, Texas and joined the Main Street Hub Team as a Community Manager in February 2017. After spending eight years as a humor columnist for Houston Community News, Summer is excited to use her writing skills to help local businesses — like her family’s flower shop.
Greene’s Florist is what some would call an institution in a small town. It’s one of the longest running businesses in Weatherford, Texas — it was opened in 1957 by my great-grandmother, Wilma Greene, and then, taken over in 1965 by my grandfather, Bobby Greene. After his passing in 1998, my mother, Donti Greene Dennis, and aunt, Erika Greene Forrest, along with my grandmother, Harlene Greene, took over ownership of the shop. Not that they hadn’t been there already; my mother started working there as a teenager, and my aunt, in her early twenties.
All three still run the business today — my mother as a bookkeeper and designer, my aunt as an office manager and designer, and my nana still helps deliver flowers twice a week. The shop is beloved, not just by our family but by our town — everyone knows at least one of the Greenes personally and never has an ill word to say about them.
My sister and I did a lot of growing up there too. It’s where we got our cheeks pinched, learned how to cut flowers and take orders, and later even learned to drive using the delivery vans (sorry for all the scares, Nana!). Ironically, I never actually learned how to design arrangements. I blame text messaging since it became so popular my sophomore year of high school.
Four generations of my family have worked at Greene’s. When I smell flowers, I don’t just smell their sweet aroma — I smell my home. I’m reminded of my family and all the years that I spent hanging out around the shop and delivering flowers with my grandmother. There is so much love and history in that building — you can feel it when you walk in — and that’s what they share with their customers every day.
There’s nothing like going into a small business and knowing exactly who will be waiting behind the door to help you, and I think that’s what local is all about. Those are the people who will go out of their way for you, who remember what you like and what you don’t, and who empathize with you whenever anything good or bad happens. Greene’s Florist does this in every way and has since the very beginning.
Supporting local isn’t just a way to put money back into your city — it’s a way to say thank you.
Learn more about our team’s Reverence for Local Business:
How Local Business Helped Me Discover Community by Austin King
My Love for Local: A Sheep at the Wheel Yarn Co. by Kayla Moses
Source: Main Street Hub
[It’s impossible to blog about Section 230 without reminding you that Congress is on the cusp of gutting it.]
The principal plaintiff, performer Mikel Knight, was the subject of critical Facebook posts related to two fatal accidents by his tour buses. Knight demanded Facebook remove the posts. Facebook refused. Knight sued. Facebook brought an anti-SLAPP motion predicated on Section 230. Sounds like an easy win for Facebook…right?
Yes, except somehow the trial court fouled this up. The court granted the Section 230-based anti-SLAPP motion for some of the claims but not for the publicity rights-related claims, despite the fact that Facebook merely ran ads alongside third-party content, and without any reference to the CCBill case, which held that Section 230 preempts publicity rights claims in the Ninth Circuit (or to the Caraccioli v. Facebook case, confirming that result). The trial court’s ruling prompted my first meltdown of 2016 about the state of Section 230 jurisprudence, when I asked “WTF Is Going On With Section 230?”
Fortunately, the appellate court fixes this mess. The court concludes that all of the claims against Facebook are covered by California’s anti-SLAPP law. While the court expressly sidesteps Section 230’s applicability to publicity rights claims, this is still a satisfying denouement to a bad ruling.
The opinion starts by considering if the case involved an issue of public interest. The posts “involved the danger of trucks on highways driven by sleep-deprived drivers,” which is clearly a matter of public interest. To get around this, the plaintiffs tried an unusual and tricky argument: they said they were suing over Facebook’s privately communicated promises to remove the content, and that communication wasn’t a matter of public interest. The court says this argument was belied by the plaintiffs’ filings, which focused on the Facebook pages’ content. I would add that an Internet giant’s decision to remove user content could be a matter of public interest based on the censorious implications, but the court didn’t need to go that far. The “commercial speech” exception to the anti-SLAPP law also did not apply.
Turning to the plaintiffs’ showing of a likelihood of prevailing on the merits, the court has little trouble concluding that Section 230 preempts the breach of contract, negligent misrepresentation, and negligent interference claims. The plaintiffs once again tried to argue that they were suing based on Facebook’s removal promises (including, apparently, the negative behavioral covenants in its Statement of Rights and Responsibilities), and once again it didn’t work. The court responds that “numerous courts have held the CDA bars claims based on a failure to remove content posted by others” (citing Hupp v. Freedom Communications, Doe II v. MySpace, Gentry v. eBay, Caraccioli v. Facebook, Klayman v. Zuckerberg and Sikhs for Justice v. Facebook).
Finally, the court turns to the publicity rights claims. The court says these claims aren’t meritorious because Facebook does not “use” the plaintiff’s identity:
The gravamen of Knight’s complaint is that Facebook displayed unrelated ads from Facebook advertisers adjacent to the content that allegedly used Knight’s name and likeness—content, Knight concedes, created by third-party users. He has not, and cannot, offer any evidence that Facebook used his name or likeness in any way.
Publicity rights law is a doctrinal mess. Courts routinely struggle with how to apply publicity rights laws to ad-supported editorial content that references or depicts a plaintiff. The appellate court got to the right place, but I don’t have much faith that future courts will do the same.
Let’s not forget how bad of a ruling this is for the plaintiffs. Facebook’s anti-SLAPP win means the plaintiffs are on the hook for Facebook’s attorneys’ fees, which aren’t going to be cheap (especially given that the appellate costs are included).
Note: On appeal, I joined the EFF’s amicus brief in favor of Facebook.
Case citation: Cross v. Facebook, Inc., 2017 WL 3404767 (Cal. App. Ct. Aug. 9, 2017)
Source: Eric Goldman Legal