Friday, September 29, 2017

SEO ranking factors: What’s important, what’s not

This week, Google celebrated its 19th birthday. A lot has changed in nearly two decades. Rather than relying primarily on PageRank to evaluate the quality of web pages, Google now uses a whole array of techniques to suggest a wide range of content in response to queries, from simple direct answers to multimedia audio and video files.

With loads of guesswork and assumptions, the debate about Google ranking factors is never-ending and evolves with every algorithm update. What’s on the rise, what’s on the decline, and what still works?

At SMX East, several sessions look closely at today’s most important ranking considerations. In SEO Ranking Factors In 2017: What’s Important, What’s Not, you’ll hear the results of comprehensive studies undertaken by Searchmetrics and SEMrush, which looked at millions of sites to determine what separated winners from losers. You’ll also hear a case study from Herndon Hasty, SEO for The Container Store, which battles formidable competition from Amazon, Walmart and other e-commerce giants.

Shari Thurow has been practicing SEO and carefully observing Google since its inception. In her always popular Search Engine-Friendly Web Design session, you’ll learn how to create search engine-friendly sites that are equally appealing to human visitors. And you’ll get juicy insights into critical aspects of SEO, including:

  • wayfinder sitemaps vs. XML sitemaps
  • guidelines for mobile-friendly URL structure
  • mobile readability tools, techniques and guidelines
  • parallax design & mobile UX: dos & don’ts

And if you have questions about particular strategies or techniques, be sure to attend the Meet The SEOs session. During this PowerPoint-free panel, veteran SEOs answer your questions about search engine optimization. Got a puzzling issue? Wondering about a possible trend? Put it to the experts.

Register for SMX East today!

SMX East is just a month away, and the best preconference rate is still available. So be sure to register for SMX East now.

The post SEO ranking factors: What’s important, what’s not appeared first on Search Engine Land.




SearchCap: Bing Ads tracking, Google Home screen & Google Fred

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Local & Maps

Link Building

Searching

SEO

SEM / Paid Search

Search Marketing

The post SearchCap: Bing Ads tracking, Google Home screen & Google Fred appeared first on Search Engine Land.




Google’s ‘Manhattan project’: Home device with a screen to compete with Echo Show

Google generally doesn’t do as well when it builds “follower” products — think Google Plus or Allo. But there are other examples where Google has excelled with later entries (e.g., AdWords, Maps). Right now, Google Home is a follower product seeking to break out of Amazon Echo’s shadow.

On paper, Google should win in this market. It has a larger developer ecosystem. And it has a better assistant. But Amazon is being very aggressive by innovating quickly and offering a dizzying array of devices at different price points. Amazon also has a more powerful sales channel. Overall, Amazon is out-innovating the rest of the “smart speaker” market at the moment.

Amazon now has two devices with screens: Echo Show and the new Echo Spot. According to TechCrunch, Google is also working on a Home device with a touchscreen:

Two sources confirm to TechCrunch that the Google device has been internally codenamed “Manhattan” and will have a similar screen size to the 7-inch Echo Show. One source received info directly from a Google employee. Both sources say the device will offer YouTube, Google Assistant, Google Photos and video calling. It will also act as a smart hub that can control Nest and other smart home devices.

A Home with a touchscreen could run Android apps and offer a stronger screen experience than the sub-optimal Echo Show. It would also enable video calling and be compatible with entertainment services such as Netflix.

Echo Show, right now, doesn’t fully utilize the screen and creates consumer expectations it doesn’t fulfill. An Echo Show 2.0 will likely be an improvement. (I haven’t been hands-on with the new Echo Spot.)

Apple is also well-positioned to offer a smart speaker with a screen — like an iPad Mini embedded in a speaker. It’s not clear whether the company will develop one. Both Amazon and Google are trying to preempt Apple’s HomePod by bringing out smart speakers with better sound that cost less than the $349 price tag Apple wants to charge.

YouTube will be something of a differentiator for Google’s new device; Google has withdrawn YouTube from the Echo Show, allegedly because that device violated Google’s terms of service.

It remains to be seen how popular touchscreen-enabled virtual assistants are, although preliminary survey data suggests there’s meaningful consumer interest. Regardless, there will likely be in excess of 30 million virtual assistant devices in US households when the smoke clears after holiday shopping is over. You can bet that Amazon will be aggressively promoting its own devices with discounts on its site and mobile apps.

Consumer data also suggests that virtual assistant devices are driving related smart home accessory purchases. The company that wins the smart speaker market will likely also control the smart home ecosystem.

The post Google’s ‘Manhattan project’: Home device with a screen to compete with Echo Show appeared first on Search Engine Land.




Targeting featured snippet and ‘People also ask’ SERP features


Search engines have a peculiar business model: They exist to quickly direct you somewhere else. This is in direct contrast to your typical web business or social platform, which does everything it can to keep you engaged and on that platform.

This can’t have escaped the notice of the good folks at Google. And now, many questions are answered directly on the search engine. This keeps you on the page a little longer and (I would imagine) ups the likelihood of your conducting another search or — shock, horror! — even clicking on a search ad.

You have probably seen this a million times, but the following searches should all provide some form of answer directly in the search results.

  • “What is my IP?”
  • “Calculator”
  • “What is the square root of 196?”
  • “Telephone number for Bowler Hat SEO”

For these kinds of queries, there is no longer a need to actually visit a third-party website — even when they are directly referencing a business, as in the telephone query example.

[Image: telephone number for bowler hat seo]

We get answers directly in the search results now, which is often super-helpful for us users.

Featured snippets

One particular SERP feature that we are seeing more commonly is known as a featured snippet (or answer box).

A featured snippet is a summarized answer to the user’s search query that typically appears at the top of the search results. The snippet will include a brief answer to the question, a linked page title and the URL of the page.

Here is a featured snippet for the question, “What is a featured snippet?”

[Image: what is a featured snippet]

We have been tinkering with some of the posts over on the Bowler Hat blog and have managed to generate featured snippets for a number of them. This is great positioning and is often referred to as “position zero,” as it sits above the standard results with a supersized listing.

As an example, we have a post that provides a set of small business SEO tips, which tends to hover around third or fourth for a variety of search terms. With a featured snippet, we now have visibility above the organic results and within the results themselves. Win-win.

[Image: small business seo tips]

This is great additional exposure. Even though I am not super-keen on the text they are using in this example, from an organic search perspective, what’s not to like?

There are a couple of different forms that featured snippets can take, from the most popular paragraph form to tables to bulleted lists. We have seen list snippets generated both from content in an <ol> tag and from header tags — which just reinforces the need for well-structured HTML.
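
As a rough illustration, here is a minimal sketch of the kind of well-structured markup that list-style snippets tend to be pulled from (the heading text and list items are hypothetical):

    <h2>How do I optimize for a featured snippet?</h2>
    <ol>
      <li>Rank on the first page for the target query.</li>
      <li>Answer the question in a concise, well-structured block.</li>
      <li>Match the format (paragraph, list or table) Google is already showing.</li>
    </ol>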

Featured snippet placement can be hugely powerful from an SEO and marketing standpoint:

  • More SERP real estate
  • More clicks overall*
  • Increased awareness and branding

* It’s worth noting that, in our experience, featured snippets don’t tend to get a huge number of click-throughs, and they reduce the click-through on the organic listings slightly. So, while it may not set the world on fire, clicks on your snippet and organic listing combined should increase compared to a listing alone, and the exposure itself is going to be highly valuable. And, of course, not all featured snippets are created equal — for the “small business SEO tips” example above, the snippet does not answer the question, so you have to click through to get the goodies.

Yet, there is another side to this coin: There is only one featured snippet, and only one company can have it. So, what impact does a featured snippet have if you are not the chosen one?

There are a few studies out there indicating that a featured snippet does reduce the number of clicks on a first-page listing. It would seem that a typical #1 listing gets around 25 percent of clicks, whereas a #1 listing with a featured snippet above it gets about 20 percent of clicks.

That’s a notable impact for sure, but we have seen far worse implications in the wild with clients we currently work with at my agency. One client site saw traffic drop by over 50 percent when a featured snippet appeared above its #1 ranking. So, these averages are not always useful, and you have to monitor the impact of SERP features like featured snippets for the terms you are targeting. For this specific client, that snippet has since disappeared — so a calm head is also needed as these new SERP features mature.

In this case, if a featured snippet appears, your rank tracker may tell you that you are still in position #1, yet traffic has dropped. So ensuring you understand the SERP features is key here.

(We like the BrightLocal rank tracker for this, as it keeps screenshots of each rank report. This is a great help when doing historical analysis of rankings and traffic, as we can see what the actual page layout looked like at any given point in time.)

People also ask

Another feature that tends to crop up along with featured snippets is “People also ask” boxes. These are sets of questions that relate to the original search query.

“People also ask” boxes are an interesting SERP feature in that they are dynamic. When you click on any one of the questions, specific details are revealed and further questions are added to the bottom of the list.

The following image shows both a featured snippet and a “People also ask” box.

[Image: how much does an app cost]

So, if we include the ad links (five with the sitelinks), the featured snippet and the “People also ask” links, our traditional #1 organic listing is the 11th link on the page (jeez). Throw a few more ads into the picture, and that is a lot of links for a user to wade through before they get to a traditional organic result — often with the answer already on the page.

How far down the rabbit hole do you want to go?

As mentioned above, when a user clicks on a “People also ask” question, we see the question itself expand to take up more screen space, and we get an additional two or three questions added to the bottom of the list.

This process repeats itself for each question clicked on. There is seemingly no limit to this, and each click pushes the traditional organic results further down the page.

Here, we see the initial four questions expanded to six questions, with the answer to the first question also revealed.

[Image: how much does it cost to develop a mobile app]

And it just keeps on going and going and going! It really can spiral, and it is almost like conducting new search queries in relation to the questions you answer right there amidst another set of search results. Wild!

After 10 clicks, we have 10 expanded questions, each about the size of two standard organic listings, and 14 further questions below. That’s roughly four full screens of scrolling on a typical desktop before you get to an organic result. This is not intended to be a realistic example of search engine usage, yet it is still a little scary if you rely on organic clicks and don’t have featured snippets.

Featured snippets = People also ask?

In the majority of cases, Googling the questions from the “People also ask” results will return a featured snippet. So, if we Google the expanded question above, “Is the Uber app free?” we get the same piece of content as a featured snippet.

So it seems the “People also ask” answers are drawn from the same pool of content as featured snippets.

Another interesting fact here, taken from the recent Ahrefs study on featured snippets, is that one piece of content can rank for many featured snippets. In fact, the top-performing page in the Ahrefs database had 4,658 featured snippets… for a single page.

Taking a look at this page and the site itself, which also has a huge number of featured snippets, the writing style is certainly interesting: Short, practical sentences. Paragraphs are, in fact, often just one sentence. It makes for easy reading and (it would seem) for easy digestion by search engine algorithms.

If you are using content marketing as a part of your SEO (and you really should be), then you should also be looking to target these new SERP features to improve your visibility and traffic from organic search.

SEO for featured snippets

Fortunately for us lucky campers, there have been a few studies done to identify the patterns here and provide guidance on optimizing your content for featured snippets.

The major takeaways here to optimize your content for featured snippets are as follows:

  1. Ensure your content already ranks well for the targeted search query — ideally, in the top five results and most certainly on the first page of results.
  2. Have the best answer, and summarize the question and answer in a way that matches the current featured snippet. This is a real opportunity if you are not the first, as you can piggyback those stronger results with better content (which is the way it should be).
  3. Ensure your content matches the kind of featured snippet that is showing for a given query — if you are targeting the paragraph format, have a paragraph of roughly 40 to 50 words that includes the question and summarized answer (see the markup sketch after this list). If you are targeting a list or table, have your content in a list or table (ideally with some form of incentive or CTA to get the user to click and read the full article as well).
  4. Don’t be afraid to experiment. Playing with the content and using the “Fetch as Google” feature in Google Search Console can show almost instant changes to the content in the answer box/featured snippet. You can also see this impact results where a site already holds the featured snippet but you also rank highly. Experiment.
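
To make the paragraph format in point 3 concrete, here is a minimal sketch; the question and the roughly 45-word answer are hypothetical, based on the definition earlier in this article:

    <h2>What is a featured snippet?</h2>
    <p>A featured snippet is a summarized answer to a search query that Google
    displays at the top of the results, above the traditional organic listings.
    It includes a brief answer, a linked page title and the page URL, and is
    often referred to as "position zero."</p>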

Fortunately, this is not terribly technical. There are no guarantees, and it requires an analysis of what the featured snippets that you are targeting look like, but with some small tweaks, you can generate big results.

Don’t forget the SEO basics

Remember that to get featured snippets, you must already rank well. So, whether you are a small business that needs to do the SEO basics, you’re focusing on SEO and content marketing, or you need to build links and authority — until you rank in the top half of the page, getting featured snippets is the least of your worries. Also, it’s worth noting that if you primarily target local terms, featured snippets don’t show along with a local pack — so this is something you don’t currently need to worry about.

That’s a wrap…

What is your experience with featured snippets? Are you getting that highly desirable position zero? Or are you struggling to get your content to feature? I would love to connect on Twitter and LinkedIn and hear how you are getting on targeting some of these new and exciting SERP features.

The post Targeting featured snippet and ‘People also ask’ SERP features appeared first on Search Engine Land.




The trouble with ‘Fred’

Disclaimer: All criticism of Google spokespeople contained herein is impersonal in nature. I know they are only representing the internal direction of the company and not acting independently. They do strive to be as helpful as they can.

When former head of web spam Matt Cutts was at Google, he spent a lot of time communicating with webmasters/site owners about updates. We knew what was coming, when it might be coming, and how severe it would possibly be.

If you woke up in the morning and your traffic had fallen off a proverbial cliff, you could go to Twitter and, based on what Cutts was posting, usually determine if Google had run an update. You could even tell how severe the rollout was, as Cutts would typically give you percentage of queries affected.

Although some believe Cutts was more about misinformation than information, when it came to updates, most would agree he was on point.

So if a site fell off that cliff, you could learn from Cutts what happened, what the update was named, and what it affected. This gave you starting points for what to review so that you could fix the site and bring it back into line with Google’s guidelines.

Why the help?

Cutts seemed to understand there was a need for the webmaster. After all, Google’s Search is not their product — the sites they return from that search are the product.

Without someone translating Google’s desires to site owners, those sites would likely not meet those guidelines very well. This would result in a poor experience for Google users. So, that transfer of knowledge between Google, SEOs and site owners was important. Without it, Google would be hard-pressed to find a plethora of sites that meet its needs.

Then, things changed. Matt Cutts left to go to the US Digital Service — and with his departure, that type of communication from Google ended, for the most part.

While Google will still let webmasters know about really big changes, like the mobile-first index, they’ve stopped communicating much detail about smaller updates. And the communication has not been in such an easily consumable format as Cutts tweeting update metrics.

In fact, very little is said today about smaller updates. It has gotten to the point where they stopped naming all but a very few of these changes.

Google communication in 2017

Right now, the Google spokespeople who primarily communicate with SEOs/webmasters are Gary Illyes and John Mueller. This is not a critique of them, as they communicate in the way Google has asked them to communicate.

Indeed, they have been very helpful over the past few years. Mueller holds Webmaster Central Office Hours Hangouts to help answer questions in long form. Illyes answers similar questions in short form on Twitter and attends conferences, where he participates in various AMA (Ask Me Anything) sessions with interviewers.

All this is helpful and appreciated… but unfortunately, it is not the same.

Highly specific information is difficult to find, and questioners are often met with more vagueness than specifics, which can at times feel frustrating. Google has become obtuse in how they communicate with digital marketers, and that seems to be directed by internal company processes and policies.

This lack of algorithmic specificity and update confirmation is how we wound up with Phantom.

Welcome, Phantom

Google has many algorithms, as any SEO knows. Some, like Penguin and Panda, have been rolled into Google’s core algorithm and run in (quasi-) real time, while others, like the interstitial penalty, still run, well, when they run.

Big updates such as Penguin have always been set apart from the day-to-day changes of Google. There are potentially thousands of tweaks to core algorithms that run every year and often multiple times a day.

However, day-to-day changes affect sites much differently than massive algorithm updates like Panda, Penguin, Pigeon, Pirate, Layout, Mobilegeddon, Interstitial, and on and on. One is a quiet rain, the other a typhoon. One is rarely noticed, the other can be highly destructive.

Now, Google is correct in that webmasters don’t need to know about these day-to-day changes unless someone dials an algorithm up or down too much. You might not ever even notice them. However, there are other algorithm updates that cause enough disruption in rankings for webmasters to wonder, “Hey Google, what happened?”

This was true for an algorithm update that became known as Phantom.

Phantom?

There was a mysterious update in 2013 that SEO expert Glenn Gabe named “Phantom.” While it seemed to be focused on quality, it was not related to Panda or Penguin. This was new, and it affected a large number of sites.

When “Phantom” ran, it was not a minor tweak. Sites — and the tools that monitor sites — would show large-scale ranking changes that only seem to happen when there is a major algorithm update afoot.

Now, there was one occasion that Google acknowledged Phantom existed. However, aside from that, Google has not named it, acknowledged it, or even denied Phantom when SEOs believed it ran. Over time, this string of unknown quality updates all became known as Phantom.

The word “Phantom” came from the idea that we didn’t know what it was; we just knew that some update that was not Panda caused mass fluctuations and was related to quality.

Not Panda quality updates

The changes introduced by Phantom were not one set of changes like Panda or Penguin, which typically target the same items. However, the changes were not completely disparate and had the following in common:

  • They were related to site quality.
  • They were not Panda.
  • They were all found in the Quality Raters Guide.

We don’t use the word “Phantom” anymore, but from 2013 to 2016, large-scale changes that were quality related and not Panda were commonly called Phantom. (It was easier than “that update no one admits exists, but all indicators tell us is there.”)

You can’t have so many sites shift that dramatically and tell SEOs the update does not exist. We all talk to each other. We know something happened. Not naming it just means we have to “make up” (educated guess) what we think it might be.

And from this mysterious Phantom, Fred was born.

‘Hello, Fred!’

In early March, 2017, the SEO world was rocked by a seemingly significant algorithm update that appeared to target link quality. Google, however, would not confirm this update, deflecting questions by responding that Google makes updates to its core algorithm nearly every day.

When Search Engine Land’s Barry Schwartz asked Gary Illyes if he cared to name the unconfirmed update, he responded jokingly that, from then on, every update would be called “Fred” unless otherwise stated.

‘Fred’ is more than a funny joke

Of course, Fred is not just a funny thing that happened on Twitter, nor is it just the default name for all Google’s future updates. In fact, it is not actually that funny when you break down what it really means. Fred is representative of something far deeper: Google’s historically unstated “black box.”

Now, Google does not use the term “black box,” but for all intents and purposes, that is exactly what “Fred” represents to webmasters and SEOs.

Meet Google’s black box

A black box is a system whose inputs and outputs (and their general relationships) are known, but where

  • internal structures are not well understood (or understood at all);
  • understanding these structures is deemed unnecessary for users; and/or
  • inner workings are not meant to be known due to a need for confidentiality.

To this end, Google has also communicated to SEOs through different channels that they are acting from a black box perspective — the way they used to before Matt Cutts took over Webmaster communications.

We have been told we don’t need to understand the algorithms. We have been told that this knowledge is not necessary to do the work. We have been told that all we need to do to be successful is be awesome. “Awesomeness” will get us where we need to be.

This all sounds good. It really does. Just be awesome. Just follow the Webmaster guidelines. Just read the Google Quality Rater’s Guide. You will be set.

Of course, the devil is in the details.

What does ‘awesome’ mean?

Follow the Webmaster Guidelines. Read the Quality Rater’s Guide. Follow these rules for “awesomeness.”

While that advice can help an SEO become awesome on a basic level, it can’t tell one what to do when there is a complex problem. Have a schema implementation issue? What about trying to figure out how to properly canonicalize pages when doing a site modification or move? Does being awesome tell me how to best populate ever-changing news sitemaps? What if you get a manual action for that structured data markup because you did something wrong? What about load times?

There are a lot of questions about the million smaller details that fall under “being awesome” that, unfortunately, telling us to “be awesome” does not cover.

This is where the black box becomes potentially detrimental and damaging. Where do you get information about site changes once you have passed the basics of the Webmaster Guidelines and Quality Raters Guide? You saw a change in your site traffic last week; how do you know if it is just your site or an algorithm update if Google won’t tell you?

Being awesome

Google no longer wants SEOs to worry about algorithms. I get it. Google wants you to just be awesome. I get that, too. Google does not want people manipulating their algorithms. Webmaster Guidelines were first written to help stop spam. Google just wants you to make good sites.

The issue is that there still seems to be an unspoken assumption at Google that anyone who wants information about algorithm updates is just trying to find a way to manipulate results.

Of course, some do, but it should be noted most people who ask these questions of Google are just trying to make sure their clients and sites meet the guidelines. After all, there are multiple ways to create an “awesome” website, but some tactics can harm your SEO if done improperly.

Without any confirmations from Google, experienced SEOs can be pretty sure that their methods are fine — but “pretty sure” is not very comforting when you take your role as an SEO seriously.

So, while “being awesome” is a nice idea — and every site should strive to be awesome — it offers little practical help in the ever-changing world of SEO. And it offers no help when a site is having traffic or visibility issues.

So, why is this important?

The lack of transparency is important for several reasons. The first is that Google loses control over the part of its product it has never directly controlled: the websites it delivers in search results. This is not a concern for site owners, but it seems the ability to actively direct sites toward their goals would be something Google would value and encourage.

They have added Developer Guides to make finding SEO/webmaster information easier, but these only help SEOs. Site owners do not have time to learn how to write a title tag or code structured data. These guides also are very high-level, for the most part — they communicate enough to answer basic questions, but not complex ones.

In the end, Google hurts itself by not communicating in greater detail with the people who help affect how the sites in their search results work.

If it is not communicated to me, I cannot communicate it to the client — and you can be assured they are not going to the Developers site to find out. I can also tell you it is much harder to get buy-in from those at the executive level when your reasoning for proposed changes and new initiatives is “because Google said to be awesome.”

If Google doesn’t tell us what it values, there’s little chance that site owners will make the sites Google wants.

Why else?

SEOs are not spammers. SEOs are marketers. SEOs are trying to help clients do their best and at the same time achieve that best by staying within what they know to be Google’s guidelines.

We work hard to keep up with the ever-changing landscape that is SEO. It is crucial to know whether a site was likely hit by an algorithm update and not, say, an error from that last code push. It takes a lot more time to determine this when Google is silent.

Google used to tell us when they rolled these major algorithm updates out, so it gave you parameters to work within. Now, we have to make our best guess.

I think it would be eye-opening to Google to spend a week or so at different SEOs’ desks and see what we have to go through to diagnose an issue. Without any clear communication from Google that something happened on their end, it leaves literally anything that happens on a website in play. Anything! At least when Google told us about algorithmic fluctuations, we could home in on that.

Without that help, we’re flying blind.

Flying blind

Now, some of us are really experienced in figuring this out. But if you are not a diagnostician — if you do not have years of website development understanding, and if you are not an expert in algorithms and how their changes appear in the tools we use — then you could find yourself barking up a very wrong tree while a crippled site loses money.

Every experienced SEO has had a conversation with a desperate potential client who had no idea they were in violation of Google’s guidelines — and now has no money to get the help that they need because they lost enough search visibility to severely hamper their business.

And that leads me to the last but most important reason that this black box practice can be so damaging.

People

People’s livelihoods depend on our doing our job well. People’s businesses rely on our being able to properly diagnose and fix issues. People’s homes, mortgages and children’s tuition rely on our not messing this up.

We are not spammers. We are often the one bridge between a business making it and employees winding up on unemployment. It may sound hyperbolic, but it’s not. I often joke that 50 percent of my job is preventing site owners from hurting their sites (and themselves) unknowingly. During earlier versions of Penguin, the stories from those site owners who were affected were often heartbreaking.

Additionally, without input from Google, I have to convince site owners without documentation or confirmation backup that a certain direction is the correct one. Can I do it? Sure. Would I like it if Google did not make my job of convincing others to make sites according to their rules that much harder? Yes.

Will Google change?

Unlikely, but we can hope. Google has lost sight of the very real consequences of not communicating clearly with SEOs. Without this communication, no one wins.

Some site owners will be lucky and can afford the best of the best of us who don’t need the confirmations to figure out what needs to be done. But many site owners? They will not be able to afford the SEO services they need. When they cannot afford to get the audit to confirm to them that yes, Google algorithms hurt your site, they will not survive.

Meanwhile, we as SEOs will have difficulties moving the needle internally when we cannot get buy-in from key players based on the idea of “being awesome.” Google will lose the ability to move those sites toward their aims. If we are not communicating Google’s needs to site owners, they will likely never hear about them. (There is a reason so many sites are still not mobile-ready!)

Is that black box worth it to Google? Perhaps. But is being obtuse and lacking in transparency truly beneficial to anyone in the long run?

It seems there are better ways to handle this than to simply direct everyone to make “awesome” sites and to read the Webmaster Guidelines. We are professionals trying to help Google as much as we are asking them to help us. It is a partnership, not an adversarial relationship.

No one is asking for trade secrets — just confirmation that Google made a change (or not) and generally what they changed.

It is like feeling really sick, going to the doctor, and he tells you, “Well you have a Fred.”

You ask the doctor, “What can I do for a case of ‘Fred?'”

He looks at you and says, “Easy! Just be awesome!” And then he walks out the door.

Well, you think, at least I have WebMD.


In the meantime, here are some ideas for how you can work with Fred and Google’s black box.

The post The trouble with ‘Fred’ appeared first on Search Engine Land.




Search in Pics: Google balloon statues, a Fiat Polski car & massage chair

In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more.

Google Polski Fiat car:


Source: Instagram

Google massage chair:


Source: Instagram

Google balloon statue art:


Source: Instagram

Google Top Contributor summit in Singapore:


Source: Instagram

The post Search in Pics: Google balloon statues, a Fiat Polski car & massage chair appeared first on Search Engine Land.




Bing Ads adds URL tracking parameters for locations & extensions

To help advertisers get more information about the source of their ad clicks, Bing Ads has introduced more URL tracking parameters and updated an existing parameter.

Updates to the {TargetID} parameter allow it to capture data for custom and in-market audience lists and targeted location IDs. TargetID already returns the ID of the keyword, remarketing list, dynamic ad target or the product partition.

Three new parameters can be appended to URLs to see which ad extensions received clicks and the location of the users who clicked; an illustrative URL template follows the parameter list below.

The new parameters:

  • {feeditemid}: The ID of the ad extension that was clicked.
  • {loc_physical_ms}: The geographical location code of the physical location of the user who clicked.
  • {loc_interest_ms}: The geographical location code of the location of interest that triggered the ad.
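
For illustration only, a hypothetical landing page URL using these parameters (the domain and the parameter names to the left of each equals sign are made up) might look like this:

    https://www.example.com/landing-page?extension={feeditemid}&user_loc={loc_physical_ms}&interest_loc={loc_interest_ms}&target={TargetID}

At click time, Bing Ads substitutes each {parameter} with its value, which your analytics or CRM system can then record.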

The post Bing Ads adds URL tracking parameters for locations & extensions appeared first on Search Engine Land.




Thursday, September 28, 2017

SearchCap: Google bugs, Bing Ads conversions & Google local finder mentions

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Industry

Local & Maps

Link Building

Searching

SEO

SEM / Paid Search

The post SearchCap: Google bugs, Bing Ads conversions & Google local finder mentions appeared first on Search Engine Land.




Make user-generated content your brand’s secret weapon

Your customers’ own words are more important to your brand than any marketing tagline you can write. More than 90 percent of consumers say they trust recommendations from others — even people they don’t know! — over branded content.

Join Marty Weintraub, founder of aimClear, and Janelle Johnson, VP of demand generation at BirdEye, as they show you how to proactively leverage customer reviews, ratings and social media comments. They’ll share best practices for using positive reviews, as well as how to turn negative user-generated content (UGC) into brand-building opportunities.

Attend this webinar and learn:

  • tips to capture authentic customer feedback in near-real time.
  • techniques to effectively share customer feedback on social channels and websites.
  • proactive steps to use UGC to sell your products and services.

Register today for “Make UGC Your Brand’s Secret Weapon,” produced by Digital Marketing Depot and sponsored by Birdeye.

The post Make user-generated content your brand’s secret weapon appeared first on Search Engine Land.




Bing Ads announces new Editor features for ad extensions

Bing Ads announced on Thursday some new features in Editor related to ad extension management.

These new features are now available for the Windows version of Bing Ads Editor, and they will soon be available for the Mac version.

Ad extension scheduling

Advertisers can now control in Editor when their ad extensions are shown, including by date, day of the week and time of day. There’s also support for basing times on either the advertiser’s time zone or the searcher’s time zone.

These scheduling options allow advertisers to run ad extensions only during appropriate times and help enhance relevance. For example, a restaurant may want callout extensions enabled in the morning to highlight breakfast items, then scheduled to turn off when breakfast service ends and replaced by callouts promoting dinner.

The new scheduling options displayed when extensions are highlighted in Editor:

[Image: bing ads editor extension scheduling]

Clicking Edit in the Ad Schedule section of the editing pane brings up a scheduling window:

[Image: bing ads editor schedule pane]

 

Shared Library support for call and location extensions

Bing has also upgraded the management capabilities of their Shared Library to include location and call extensions. Advertisers can now create a library for each and quickly associate them with multiple campaigns in a few clicks.

This will save time spent going into individual campaigns one by one to assign these extensions, and it makes mass edits easier. For example, if a phone number changes, updating the number once in the Shared Library will automatically populate the new number in all campaigns using it.

These options now appear in the Shared Library section in the left-hand margin:

 

[Image: bing ads editor call library]

The post Bing Ads announces new Editor features for ad extensions appeared first on Search Engine Land.




Canonical tags gone wild

Being a technical SEO, I love digging into any weird problems where things don’t seem to work as expected.

Canonical tags seem easy enough, but these tags cause all sorts of interesting issues — and some minor fixes can lead to big wins. Almost every major website will have some kinds of issues with their canonical tags, so I dug into a few different ones to see what examples I could find.

Canonical tags thrown into the <body>

In my recent post, “Canonical tags are easy, right?,” I gave an example of a canonical tag that looks fine if you view the source code, but if you use “Inspect” in Chrome Dev Tools to view the DOM tree, you’ll see that the <head> section of Home Depot’s website breaks early and the canonical tag is thrown into the <body> section, where Google will ignore it.

[Image: canonical tag in the body]

What’s the worst that could happen if all of your canonical tags are ignored? You won’t have control over the preferred version or consolidation of signals. Many pages will be indexed with the wrong version, or you may have multiple versions of the same page indexed without consolidating the signals, and no version of that page will rank as high as it should.

A few different searches in Google show that parameterized URLs on Home Depot’s website are getting indexed even though they have a canonical set to the clean URL.

An interesting note is that the canonical tags seem fine on Home Depot’s mobile website. Likely, one of the scripts they are calling on the desktop version of the site is causing the issue, but the problem will resolve itself with the upcoming mobile-first index.

If Home Depot wanted to fix this sooner, they could probably get away with moving the canonical tag up in the <head> section so it sits above all the scripts, or they could figure out what is causing the <head> section to close early (likely a tag that wasn’t closed properly).
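
As a rough sketch of that fix (URLs and file names are placeholders), the canonical belongs near the top of the <head>, before any scripts:

    <head>
      <title>Product Page</title>
      <link rel="canonical" href="https://www.example.com/product" />
      <!-- Scripts load after the canonical, so a malformed or unclosed tag in
           this area can no longer push the canonical down into the <body>. -->
      <script src="/assets/app.js"></script>
    </head>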

Canonical tags when every version references itself

What happens when you have multiple versions of the same page, and each version has a canonical tag that says it is the correct version? The answer is that Google will choose one, or both, but it probably won’t be consistent.

That’s exactly what happens on Meetup.com. Meetup pages have at least two versions used interchangeably: one with the name of the Meetup in mixed case as it was entered and one all lowercase. Mixed case of any kind works for the URLs on Meetup.com; try making any of the lowercase characters capitalized, or vice versa.

So, if we have two versions that both work, and both say they are correct, what happens?

<link rel="canonical" href="http://ift.tt/2wZtZwx" />

<link rel="canonical" href="http://ift.tt/2yaDkXq" />

In this case, both pages are being indexed, but only one will show. Both versions have links, and the equity is currently split. I have added &filter=0 to the Google search in the screen shot below so that I could show that both are indeed indexed. You can also check by doing an info: command for the different URLs to see the canonicalized version.

[Image: Raleigh SEO Meetup SERPs showing duplicate content filtering]

To recap: Both versions of the page are indexed, both have links, and only one can show. A quick fix from Meetup here could consolidate a lot of signals that are currently split, and they would likely see a large traffic increase.
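
A minimal sketch of that fix, with placeholder URLs: whichever case variant of the URL is served, the canonical should reference a single preferred (here, lowercase) version rather than each page pointing to itself:

    <!-- Served on both /Raleigh-SEO-Meetup/ and /raleigh-seo-meetup/ -->
    <link rel="canonical" href="https://www.example.com/raleigh-seo-meetup/" />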

Forgot to include the canonical tag or included the wrong page

A quick search for “sams club tires” will show you both desktop and mobile versions of the samsclub.com tires page. The problem here is that the m.samsclub.com/tires page does not have a canonical tag at all, allowing both pages to show.

Even if the indexed m. page had the canonical tag in place, I don’t think this one would work correctly — the desktop site references a different mobile page as the alternate (http://ift.tt/2fB7Ceq), and that page 302 redirects to the m. page shown in the SERPs above (http://ift.tt/2yufi5S).

Having the mismatch on alternate versions is a common problem on their site, and m.samsclub.com shows in many desktop search results because they aren’t indicating the connections in the way they need to in order to consolidate the pages.

Without establishing the relationship between desktop and mobile versions of the page, they will be treated as separate entities — both can be shown in search results, and neither will rank as high as they should if this was done correctly.
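
For reference, a minimal sketch of how that desktop/mobile relationship is usually declared with separate URLs (placeholder URLs; the media query is the commonly documented pattern):

    <!-- On the desktop page (https://www.example.com/tires): -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/tires" />

    <!-- On the mobile page (https://m.example.com/tires): -->
    <link rel="canonical" href="https://www.example.com/tires" />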

When I started writing this, it looked like there were also some issues around canonicalization with what appeared to be a dev server. The canonical tags were using the subdomain of the dev server, which was prod-i.samsclub.com. These pages were getting indexed and sometimes being chosen as the version to show, even for the home page of the website.

It looks like they have recently fixed this as they redirected prod-i.samsclub.com to www.samsclub.com, but you can still see many of these pages in the index with a site: search, and their cached versions still show the incorrect canonical tag. If you’re going to expose an environment like that, I’d highly recommend using server-side authentication so search engines won’t be able to crawl it in the first place to avoid problems like this.

Another potential disaster is copying a page and not changing the canonical or accidentally setting a section or even an entire website to canonical to a single page. While some of these will be ignored, others may be respected, and you could see a decline in traffic for many pages.

Canonical tags with URL parameters

There are lots of ways canonical tags can go wrong when you have multiple versions of a page. First up, if you have multiple versions of a page and no canonical, then what happens? That’s right, you get multiple versions indexed.

A more interesting question might be, what happens when you have a separate mobile version that has parameters? When you are connecting, say, an m. mobile site and the desktop version, then you have to specify the alternate version of the page on the desktop site and canonical from the mobile site back to the desktop.

What happens when you only link to one version of the mobile site, but URL parameters make it so there is more than one version of the page? The others get indexed, as they have with this page — site:samsclub.com pretend play inurl:1938.

Did you know there’s also a tool in Google Search Console for handling parameters?

[Image: canonical tags and e-commerce optimization]

Canonical tags ignored

Remember that canonical tags are a hint, not a directive. They’re made to be used for duplicate versions of pages, and you can get away with nearly duplicate versions in many cases. If the page you set as the canonical is too different from your target page, the canonical will likely be ignored.

This happens with the channels page under YouTube user accounts; just check out site:youtube.com inurl:channels. In some situations, other signals might overpower canonical tags as well. Things like how URLs are submitted in the sitemap and how the pages are internally linked are other signals, and Google also has preferences for things like HTTPS versions and shorter URLs.

Canonical tags with other tags

Canonical tags can have all kinds of issues when used with other tags. I would say don’t point the canonical on page 2 to page 1 in a paginated set, don’t use noindex on pages with a canonical tag, and be very careful with hreflang tags since each page needs to be the indexed version. There are tons of other problems that happen when canonical tags interact with other tags.
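
For the hreflang case specifically, a minimal sketch with placeholder URLs: each language version lists all alternates and uses a self-referencing canonical, since a canonical pointing elsewhere would conflict with the hreflang annotations:

    <!-- On https://www.example.com/en/page -->
    <link rel="canonical" href="https://www.example.com/en/page" />
    <link rel="alternate" hreflang="en" href="https://www.example.com/en/page" />
    <link rel="alternate" hreflang="de" href="https://www.example.com/de/page" />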

Canonical tags and redirects

It’s generally a bad idea to point a canonical at a page that redirects. This usually breaks something or consolidates signals inconsistently. Take Amazon stores, for instance, where there are lots of redirects and weird canonicalization happening.

Look what happens, and notice that at each step there are pages indexed and that the URLs might use the clean name or the store IDs or both.

  • http://ift.tt/2fCaMhG
  • 301 > http://ift.tt/2yvE3P5
  • 302 > http://ift.tt/2fB7CLs
  • 302 > http://ift.tt/2yw4JPG
  • 301 > http://ift.tt/2fB7DPw
  • 301 > http://ift.tt/2yvamNZ
  • This last one is the version where many of the Amazon store listings are, but we’re not done yet.
  • This page has a canonical set as http://ift.tt/2fB7Emy (not the same as the URL).
  • That URL does a 302 redirect to http://ift.tt/2yvkM07
  • And finally, that page has a canonical set as https://www.amazon.com/.

Canonical tags always create the most interesting issues, and things don’t quite work out the way you would expect. I’d bet that some of these pages end up canonicalizing to that final Amazon home page version and give the home page a bit of a boost.

The point is that the canonical tag is powerful, and it can go wrong easily — so double-check your website to see what kinds of issues there might be.

Check your canonical tags

I found most of the examples in this article with a simple site: search of the domain in Google — sometimes removing filtering, as with the Meetup example, or searching for an individual product or a phrase I saw in one of the title tags to see whether other versions were indexed.

None of these examples took long to find, and I didn’t even use a crawler, but you definitely should use a crawler when looking for issues on your own website. I would expect any major website out there to have more than a few examples of canonical tags gone wild.

The post Canonical tags gone wild appeared first on Search Engine Land.




Bug drops Sitelinks Searchbox from Google search results

A bug in Google has wiped out a search feature that allows searchers to search within a specific site from the Google search result snippets. The feature launched three years ago, and it is officially called a Sitelinks Searchbox, where Google will show a search box directly within a search result snippet, allowing the searcher to restrict their search results to pages within the site listed.
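
Sites typically implement this feature with schema.org WebSite/SearchAction structured data; a minimal sketch (placeholder domain and search URL) looks like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "WebSite",
      "url": "https://www.example.com/",
      "potentialAction": {
        "@type": "SearchAction",
        "target": "https://www.example.com/search?q={search_term_string}",
        "query-input": "required name=search_term_string"
      }
    }
    </script>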

Here is what it looked like on a search for “YouTube” when the feature launched.

Now, the Sitelinks Searchbox is no longer showing, at least temporarily. RankRanger caught the feature dropping out of Google globally a couple of days ago.

Google is working on it, but there is no estimated timeline for a fix to be rolled out.

The post Bug drops Sitelinks Searchbox from Google search results appeared first on Search Engine Land.




Bing Ads rolling out offline conversion imports to capture impact of ads on offline sales

Advertisers that run Bing Ads campaigns to generate leads can now upload offline conversion data back into the platform to get a more complete picture of campaign impact on actual sales.

How Bing Ads offline conversion tracking works

The system leverages the Microsoft Click ID (MSCLKID) that gets appended to every Bing Ads URL when an advertiser enables the new Offline Conversion Import feature. That click ID gets passed in the referring URL and can either be stored in a cookie or remain persistent in the URL as a user browses the website. The click ID then gets passed into the advertiser’s CRM system when a user fills out a lead form on the website and remains associated with that user. If the user converts offline after speaking with a sales agent, the conversion can be connected back to the click ID.

When an advertiser uploads that offline conversion data back into Bing Ads, the conversions appear at all levels of conversion reporting.

Setup required

There are several steps advertisers need to take to enable offline conversion tracking.

The first step is to create an Offline Conversion goal, which is now an option in the list of conversion goal types. After the goal is created, Bing Ads will automatically turn on auto-tagging in the account to append the Microsoft Click ID to ad URLs.

Advertisers then need to update the tracking code on their websites in order to capture the click ID and store it with the lead information in their CRM systems.
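
As a rough sketch of what that tracking code might look like (the cookie name, hidden field, form action and 90-day lifetime are assumptions, not Bing Ads requirements):

    <form action="/lead" method="post">
      <!-- Hidden field so the click ID travels with the lead into the CRM. -->
      <input type="hidden" name="msclkid" id="msclkid-field" value="" />
      <!-- ...other lead form fields... -->
    </form>

    <script>
      // Capture the Microsoft Click ID that Bing Ads auto-tagging appends to the URL.
      var msclkid = new URLSearchParams(window.location.search).get("msclkid");
      if (msclkid) {
        // Assumption: a 90-day cookie keeps the ID until the form is submitted.
        document.cookie = "msclkid=" + msclkid + "; max-age=" + 90 * 24 * 60 * 60 + "; path=/";
        document.getElementById("msclkid-field").value = msclkid;
      }
    </script>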

Offline conversion tracking is rolling out globally over the next few weeks.

The post Bing Ads rolling out offline conversion imports to capture impact of ads on offline sales appeared first on Search Engine Land.




7 on-site SEO problems that hold back e-commerce sites

Not long ago, I talked about 16 very specific on-site SEO mistakes that I see very often, and how to fix those issues.

Today, I want to shift the focus toward problems that plague e-commerce sites specifically. I’ll also be addressing on-site problems that have a bit more to do with strategy and a bit less to do with specific technical mistakes.

Finally, I wanted to make sure we had some real-world examples to refer to, so I mined case studies from the industry to demonstrate the concrete impact these changes can have on your search traffic.

Let’s take a look at these problems and what you can do to resolve them.

1. Weak product descriptions (or none at all)

Since e-commerce sites usually have a very large number of products, it’s common for product descriptions to be short, automated and provided by the manufacturer.

Unfortunately, this creates a few problems for SEO:

  • Short descriptions give the search engines very little content to work with, and this is a problem. After analyzing 1 million search results, Backlinko concluded that longer content generally performs better: The average Google first page result is 1,890 words long.
  • Automated descriptions that swap a few words into a template can create duplicate content issues.
  • Descriptions provided by the manufacturer are almost certainly replicated on other sites, meaning that you are not providing anything unique for the search engines to index. This means the search engines have no reason to rank you above competitors.

It’s not always possible to manually update the product descriptions for every page on your site, but this action isn’t strictly necessary to resolve these issues. A focus on turning just a few of the highest-value product pages into full-fledged landing pages with conversion-based copy can have a dramatic effect on the rest of the site.

An Australian retailer named Toy Universe was able to increase its search engine traffic by 116 percent in just four months. That jump in traffic also doubled sales. While many changes were involved in that boost, a large chunk of the effort went toward work on the product descriptions.

Put simply, the site did not originally feature any unique product descriptions or unique content for the category pages. Adding them in was a huge piece of the puzzle.

The Motor Bookstore serves as another classic example.

When Google first released the Panda update, this online automotive bookseller saw a 38.5 percent drop in organic search traffic overnight. The brand was well respected by its customers, but their product descriptions were supplied by the publishers; as a result, those descriptions were identical to the descriptions found on many other sites.

That duplicate content didn’t look good to Google — hence the drop in traffic after Panda was introduced. (These days, Panda is baked into Google’s core ranking algorithm, so your site could be affected by it without your knowing.)

Eliminating duplicate product descriptions and replacing them with unique descriptions can help resolve this issue. Opening up your site to user-generated content like reviews can also help by introducing new content to reduce the proportion that is duplicate — a strategy that has obviously worked wonders for Amazon.

On that note…

2. Not including user reviews

In addition to diluting duplicate content, user reviews seem to affect search results in other ways. It’s not entirely clear whether the presence of reviews affects search engine results directly or indirectly, but the impact is clear and unambiguous.

Yotpo conducted an analysis of over 30,000 businesses that added user reviews to their site and measured how this impacted organic search traffic. The results were stark: Over a period of nine months following review implementation, they found that Google organic page views per month grew by over 30 percent.

Including user reviews can be scary, as this allows buyers to leave negative feedback on your products. But there is a wealth of evidence that including user reviews increases conversion rates. In fact, in a bizarre twist of fate, a diverse mix of product ratings improves conversions more than five-star-only reviews do.

If you’ve been hesitating to include user reviews because of concerns about negative feedback or due to the difficulties of implementation, I highly recommend you take the plunge now. The impact is almost sure to be positive.

3. No unbranded keyword optimization

Perhaps one of the most common issues is that many e-commerce product pages are simply not developed with keywords in mind.

The typical product page is built around a brand and model name. It’s certainly true that some consumers may be searching for these names, and they should definitely be included in the title tag and the other important locations on the page.

That said, most consumers are likely not searching by brand or model name, especially when it comes to more obscure brands.

For that reason, it’s important to include more generic, popular phrases on your pages as well.

This isn’t to say that you should abandon any more niche keyword usage. What I mean is that you should be going after phrases that consumers are using when they search for products like yours, and that means going deeper than branding to focus on the actual mechanics of the consumer journey.

White Hat Holsters did just that, and the result was a 100 percent increase in sales and a 400 percent increase in search engine traffic. The traffic grew from 2,000 to 8,000 visits per week in just eight months.

To accomplish this, they:

  1. used keyword tools and competitive research to identify phrases that consumers were actually using to find products like theirs.
  2. audited the existing meta descriptions, image alt text, URLs and headings for those phrases (a quick audit like the sketch after this list can help with this step).
  3. chose three to five closely related keywords to target for each page and updated the above-the-fold region of the pages to semantically reflect those keywords.
  4. created a blog to capture keywords searched for by consumers who were further up the funnel.
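
As a starting point for that kind of on-page audit, here is a minimal Python sketch that checks whether a target phrase appears in a page's title, meta description and headings. It assumes the requests and BeautifulSoup libraries are installed, and the URL and keyword shown are placeholders rather than anything from the White Hat Holsters case study.

    import requests
    from bs4 import BeautifulSoup

    def keyword_presence(url, keyword):
        """Report whether `keyword` appears in the title, meta description and headings of `url`."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        title = soup.title.get_text() if soup.title else ""
        meta = soup.find("meta", attrs={"name": "description"})
        meta_desc = meta.get("content", "") if meta else ""
        headings = " ".join(h.get_text() for h in soup.find_all(["h1", "h2"]))

        kw = keyword.lower()
        return {
            "title": kw in title.lower(),
            "meta_description": kw in meta_desc.lower(),
            "headings": kw in headings.lower(),
        }

    # Placeholder example -- swap in your own product pages and target phrases.
    print(keyword_presence("https://www.example.com/products/widget", "blue widget"))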

4. Focusing too heavily on transactional keywords and not developing informational content

It’s incredibly difficult to rank for “money” keywords, and it’s usually a failing strategy to focus too heavily on them, especially if this means you are neglecting the informational keywords that target customers who are a bit further up the funnel.

By shifting some of your attention away from transactional keywords and toward informational ones, you can compete for phrases that are actually within reach.

Ranking for these less competitive phrases doesn’t just add traffic for those individual phrases; it can also improve your site’s overall reputation with the search engines. This may be because it influences behavioral metrics. Whatever the cause, I’ve personally witnessed this effect many times.

Darren Demates helped a medical e-commerce site skyrocket its search traffic by an incredible 1,780 percent using an interesting keyword method he calls the “double jeopardy technique.” Here was his process:

  1. He obtained keywords by entering the page URLs into the Google Keyword Planner, rather than the usual method of guessing keywords and looking at the related suggestions. He also focused on informational keywords instead of transactional keywords.
  2. He used SEMrush to find the keyword difficulty for the keywords recommended by the Google Keyword Planner, then weighed that difficulty against the potential traffic to decide which keywords to focus on most heavily (a simple scoring sketch for this step appears below).
  3. He used a “site:example.com [keyword]” search to identify which page on the site already had the most ranking potential for that keyword.
  4. He searched forums using an “inurl:forum [keyword]” search to find the types of questions people were asking about his informational keywords.
  5. He updated the thin blog posts on his site to answer all of the questions about those keywords that he could find people asking on forums.

I’d recommend taking notes here and putting this to use. The possibility of increasing search engine traffic by an entire order of magnitude isn’t the kind of thing you want to ignore.
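
If you want to make step 2 of that process more systematic, a simple opportunity score that weighs search volume against difficulty is enough to rank your candidates. The sketch below is illustrative only: the keywords and numbers are placeholders, and it assumes you've already exported volume and difficulty figures from your keyword tool of choice.

    def opportunity_score(volume, difficulty):
        """Rough opportunity score: reward monthly search volume, discount by keyword difficulty (0-100)."""
        return volume * (1 - difficulty / 100)

    # Placeholder data in the shape of a keyword-tool export: (keyword, monthly volume, difficulty).
    candidates = [
        ("how to choose a knee walker", 1900, 35),
        ("knee walker vs crutches", 880, 22),
        ("buy knee walker", 2400, 71),
    ]

    for keyword, volume, difficulty in sorted(
        candidates, key=lambda row: opportunity_score(row[1], row[2]), reverse=True
    ):
        print(f"{keyword:35s} score={opportunity_score(volume, difficulty):8.1f}")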

5. Implementing poorly planned site redesigns

This one hurts to watch.

I’ve had clients rush ahead with a site design without notifying me, and I’ve had new clients who approached me after a site update tanked their rankings.

This experience is incredibly painful, because a site redesign intended to modernize and beautify a site, or to implement changes expected to maximize conversions, can end up obliterating your organic search traffic. Few things hurt more than dropping a wad of cash on something and having it backfire on you.

If you implement a site redesign without taking SEO into account, this is almost bound to happen. Pages that ranked well can get lost, content that was pulling in traffic can get rearranged, and the gains from past wins can be wiped out.

Seer Interactive assisted one retail client that had redesigned its site as part of a move to HTTPS. The redesign caused organic traffic to plummet by a staggering 75 percent; the situation was so bad that the site no longer ranked for its own brand name.

What happened?

  • The redesign deleted several key pages that had been pulling in traffic. This didn't just lose the traffic those pages were earning; it also created 404 errors wherever other pages on the site still linked to the missing URLs, which can cause PageRank to drop like a brick (the redirect-map sketch after this list addresses exactly this problem).
  • The site had a new URL structure, meaning all of the links pointing to the old pages were now pointing to nothing, and all of the authority the site had built up in the past was tossed right out the window.
  • The redesign introduced copies of pages, producing duplicate content that may have caused the site to be algorithmically penalized.
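
The standard first aid for the first two problems is a one-to-one 301 redirect map that points every old URL at its new equivalent, so visitors and accumulated link authority both land on the right pages. Here is a minimal sketch that turns a spreadsheet of old and new paths into nginx-style redirect rules; the file names, column names and output format are assumptions for illustration, not what Seer Interactive actually used.

    import csv
    import re

    def build_redirects(mapping_csv="redirect_map.csv", out_path="redirects.conf"):
        """Read old_path,new_path rows and write one permanent-redirect rule per line (nginx syntax)."""
        with open(mapping_csv, newline="", encoding="utf-8") as src, open(out_path, "w") as out:
            for row in csv.DictReader(src):
                old, new = row["old_path"].strip(), row["new_path"].strip()
                if old and new and old != new:
                    # re.escape keeps regex metacharacters in old paths from breaking the rule.
                    out.write(f"rewrite ^{re.escape(old)}$ {new} permanent;\n")

    if __name__ == "__main__":
        build_redirects()

The same mapping can just as easily be written out as Apache RewriteRule lines or fed to whatever redirect manager your platform provides; the important part is that every old URL resolves with a single 301 hop to a relevant new page.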

After fixing those issues and introducing a long-term content strategy, the site experienced a 435 percent growth in search traffic. This led to a 150 percent increase in transactions, and a 64 percent increase in revenue. This was accomplished in six months.

Do not execute a site redesign without the help of an SEO professional. The results can be absolutely horrifying.

6. Poor migration between e-commerce platforms

It’s a safe bet that most e-commerce sites are built using third-party platforms. This is a mutually beneficial arrangement that allows the e-commerce business to focus more on its core business and less on web development.

It's not uncommon for a site to outgrow one platform and switch to another as its market share increases, or to switch platforms in order to gain access to previously unavailable features. Unfortunately, switching e-commerce platforms can sometimes hurt rankings.

In one case, TotalHomeSupply.com found itself losing 37 percent of its traffic after switching from Volusion to Mozu. Despite Mozu being owned by the same company and serving as the enterprise-level version of the same platform, the transfer led to technical SEO issues. (This isn't a knock against Mozu; problems like these can arise with any e-commerce platform if the migration isn't handled carefully.)

The drop in traffic led to a 21 percent drop in transactions, even though the conversion rate rose by 24 percent (a boost Mozu may have contributed to).

The core issue was that Mozu rendered the pagination with JavaScript rather than plain HTML links, leaving it invisible to search engine crawlers. Inflow worked with Mozu to eliminate the JavaScript issues so that Google could properly crawl the paginated category pages.

In addition, they trimmed thin content that had led to a demotion from Google’s Panda update and introduced new, high-quality content.

The result was a doubling in year-over-year organic revenue and a restoration of organic traffic to levels higher than before the site migration.

As with site redesigns, make sure an SEO professional is involved any time you update your e-commerce platform.
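
One quick sanity check you can run yourself before (and after) any migration: fetch a category page from the new platform and see whether pagination links exist in the raw HTML, without any JavaScript executing. Below is a minimal sketch; the URL and the URL patterns it looks for are placeholder assumptions, so adjust them to match your own pagination scheme.

    import requests
    from bs4 import BeautifulSoup

    def pagination_visible_without_js(category_url):
        """Return True if plain-HTML pagination links are present in the raw, unrendered HTML."""
        html = requests.get(category_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        # Anchors with page parameters or /page/N paths count as crawlable pagination;
        # pagination injected purely by client-side JavaScript won't appear here at all.
        hrefs = [a["href"] for a in soup.find_all("a", href=True)]
        return any("page=" in href or "/page/" in href for href in hrefs)

    # Placeholder example URL.
    print(pagination_visible_without_js("https://www.example.com/category/air-conditioners"))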

7. Not optimizing for your most promising keywords

In the section on informational keywords, we mentioned Darren Demates' "double jeopardy" technique. Beyond its focus on informational keywords, part of that strategy's success lies in the fact that it leverages keywords that already show promise.

We mentioned that his technique involves performing a site: search to identify which page on the site already ranks best for any given keyword.

A related method of identifying keywords is to analyze your existing rankings to see which keywords are already performing well, and to make changes in order to better optimize for those keywords.

This is what Digital Current did for Sportsman’s Warehouse.

They identified “low-hanging fruit” pages which were already ranking fairly well for keywords, then updated those pages by tweaking the titles, headers and content in order to better reflect the keywords. They were careful to focus on keywords which would be “in season” shortly after the changes were made.

In addition, they performed some link building and improved the quality of the on-site content.

The changes resulted in a 31 percent year-over-year increase in organic search traffic and a 25 percent year-over-year increase in organic search revenue. That worked out to roughly a threefold return on what they had paid for the SEO work.

There are two primary methods you can use in order to optimize for promising keywords. The first is to run your URLs through the Google Keyword Planner as in the “double jeopardy” technique.

The second is to look at your keyword rankings in the Google Search Console or a tool like SEMrush to identify keywords that are already ranking on the second page or so. If these keywords are ranking without having already been optimized, they are a golden opportunity, and you should capitalize on them by updating your titles, headings and content.

This second approach is sometimes called the “low-hanging fruit” technique.
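
Here is a minimal sketch of that second approach, working from a performance report exported from Google Search Console. It assumes the pandas library is installed and that the export has Query, Impressions and Position columns; the exact column names vary by export method, so adjust them to match your file.

    import pandas as pd

    def low_hanging_fruit(csv_path="queries.csv", min_impressions=100):
        """Return queries with an average position on page two (11-20) and meaningful impressions."""
        df = pd.read_csv(csv_path)
        page_two = df[df["Position"].between(11, 20) & (df["Impressions"] >= min_impressions)]
        return page_two.sort_values("Impressions", ascending=False)

    # The top of this list is your shortlist of pages to re-optimize with updated titles, headings and copy.
    print(low_hanging_fruit().head(25))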

In the process, it’s important to make sure you aren’t cannibalizing your rankings for more important keywords, and, of course, to verify that the changes in the content will be useful for users and will not detract from the primary message of the existing page.

Time to put this information to use!

Don’t close that browser tab just yet. Leave it open and start taking a look at your site. Take a look at the problems I’ve listed here, and ask yourself if you’re facing any of them right now. You’ll thank me when your search traffic starts climbing.

The post 7 on-site SEO problems that hold back e-commerce sites appeared first on Search Engine Land.



from SEO Rank Video Blog http://ift.tt/2hz9gNY
via IFTTT

Google local finder rolls out website mentions matching your query

Sergey Alakov first noticed that Google has begun rolling out the local finder "website mentions" feature, which the company began testing earlier this month.

Website mentions in the local finder show the searcher whether their query (the search they used to bring up the local results in Google) matches any of the content found on the local business's website. The feature now seems to be live for everyone, and I am able to replicate the "Their website mentions" notation myself. Here is a screenshot:

The interesting part about this feature is that it demonstrates Google does crawl, and can surface, local business website content, yet the company has never explicitly said that site content is a factor in local rankings. As I've said before, Google has never confirmed that local rankings use signals from the content of the local business website; in fact, its guidelines and tips make no mention of it.

The post Google local finder rolls out website mentions matching your query appeared first on Search Engine Land.



from SEO Rank Video Blog http://ift.tt/2ybfNpg
via IFTTT