Beginning in July 2022, advertisers will no longer be able to create new ETAs or edit existing ETAs in Google Ads, according to an announcement by Sylvanus Bent, Product Manager, Google Ads. “Your existing expanded text ads will continue to serve alongside responsive search ads, and you’ll still see reports on their performance going forward. Additionally, you’ll be able to pause and resume your expanded text ads or remove them if needed. You’ll also still be able to create and edit call ads and Dynamic Search Ads,” said Bent.
We’re continuing to expand this story as we learn more.
Use ETA info for RSAs in the transition. “To prepare for this change, we recommend that you have at least one responsive search ad in every ad group in your Search campaigns by June 30, 2022,” Bent suggested. Google’s announcement also includes ways advertisers can repurpose their ETA content for RSAs. Recommendations include the following:
Repurpose high-performing content from your expanded text ads and focus on Ad Strength
Pin headlines or descriptions to specific positions in your responsive search ads
Evaluate the success of your ads based on incremental impressions, clicks, and conversions your ad groups and campaigns receive
Though take the pinning recommendation with a grain of salt. “If you’re in an industry where your ad must contain certain pieces of text in very specific locations, pin away! But if it’s not required, we found that attempting to control the machine by telling it what text to pin to certain ad positions was usually detrimental to results,” wrote Frederick Vallaeys, Cofounder of Optmyzr, for Search Engine Land.
This “stinks for anyone in highly regulated fields. Would be nice if the RSA data was usable or scientific instead of a generic ‘Best’ or ‘Good’ & knowing combos that work together would help,” wrote Greg Finn, Partner & Digital Marketer at Cypress North, in a tweet.
“Don’t love this, but as long as they don’t take away the option to pin headlines/descriptions in RSAs we can at least approximate the control we have had with ETAs,” added Tim Jensen, PPC Campaign Manager at Clix Marketing.
Why we care. This is the latest move that Google is making to push automation through its ad products. The announcement says that “15% of search queries every day are new searches we’ve never seen before” and therefore “Automation is key to keeping pace with these trends.” Many advertisers do use RSAs, but they also like having the control and capabilities that ETAs offer. The future phase-out of ETAs means advertisers are moving further away from direct control over their accounts and having to work with Google Ads machine learning and AI.
Before the sunset is complete, we recommend testing your ETA ad pieces in RSAs and figuring out what works best so you’re not cut off completely from new ad creation when Google Ads stops allowing new ETAs. “If you’re evaluating RSAs on incrementality, their conversion rates might be lower than ETAs but the efficiency of those conversions might be better — lower cost per conversion, higher margin and/or lifetime value — and come from impressions your ETAs weren’t eligible for. But measuring this is far from straightforward because the reporting on RSAs is limited and there’s no way to easily tie a query to an ad much less an RSA combination,” wrote Ginny Marvin for Search Engine Land around this same time last year.
Your final chance for industry-wide recognition, awesome social buzz, and a hefty boost to company morale is fast approaching: The 2021 Search Engine Land Awards entry period closes at the end of this week — Friday, September 3 at 11:59pm PT to be exact!
There’s still time for you to submit your application — especially because the submission process is faster and easier than ever. But before you begin that entry, take a moment to meet the judges, a fabulous panel of experts across the SEO and PPC spectrum…
… and look over their first-hand advice for what makes a winning SEO or PPC campaign, and thus, a winning Search Engine Land Awards application.
Azeem Ahmad
For me, a successful campaign starts at the end. Yes, you read that right. What I mean by that is that for any campaign to be a success, you absolutely have to decide what the measure of success is, and what success looks like for you. Jumping in without clarity often leads to more harm than good for the business.
It’s ok to just say “we want more traffic”, but the great campaigns go a step further and say “with this campaign, we’re looking to increase leads from [x] to [y] – or a [%] YoY growth – by implementing these strategies.”
Crystal Carter
What makes the difference in a good SEO campaign is a clear understanding of the target audience – what you have to offer and what is most genuinely of value to those users. SEO provides a wealth of user data and insights to underpin content, advertising, promotion, and technical improvements. Putting that targeting to good use makes all the difference.
Emily Mixon
My tip for a successful PPC campaign is negative fencing, particularly when you’re in a niche product category, and with all the recent match-type updates and close variants. To maintain, and especially improve, efficiency, campaigns need to be structured in a way that maintains control of keyword targeting and exclusions in an environment where automation is taking over.
For starters, begin with a phrase match-only campaign with a select few keywords, which you know will have a breadth of search queries matching back to them. For example, if you’re selling kitchen appliances, you could target “ovens” and “ranges”, and then collect the data around all of the search queries matching to them over a period of time. It is key to set a cadence for this, and stick to it, or the number of search queries to sift through could become unmanageable. My cadence, on a good week, is every Monday, pulling the last 7 days of data.
Next, identify the search queries that are driving significant conversion volume, and add them as exact match keywords into an exact match-only version of the campaign. Do not add them as phrase matches to your phrase match campaign. The point here is to funnel traffic to exact match keywords, for which you can tailor specific and relevant ads and landing page experiences, (ex. [double oven wall unit] or [stainless steel gas ranges]). The phrase match keywords have done their job at this point in identifying converting search queries.
Finally, and most importantly, add the new exact match keywords from your exact match campaign as negative exact match keywords in the phrase match campaign. This helps to ensure that those queries hit the exact match keyword, which should have a better Quality Score and CPCs than phrase match, and frees up budget for the phrase match keywords to find more search query variations.
Also, while combing through the SQR, be sure to add irrelevant search queries, such as “driving ranges” or “dutch ovens” in this example, as negative keywords to the campaign to help reduce wasted spend.
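For teams that export their search query reports, the promote-and-negate step above can be partially scripted. Here is a minimal Python sketch under assumed inputs: the CSV file name, its “query” and “conversions” columns, and the conversion threshold are all hypothetical, and the actual keyword changes would still be made in the Google Ads interface or API.

```python
import csv

CONVERSION_THRESHOLD = 3  # hypothetical cutoff for "significant" volume
IRRELEVANT = {"driving ranges", "dutch ovens"}  # known wasted-spend queries

exact_additions = []  # promote these to the exact match-only campaign
negatives = []        # add these as exact match negatives in the phrase campaign

# Assumed export format: one row per search query with a "conversions" column.
with open("search_query_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        query = row["query"].strip().lower()
        if query in IRRELEVANT:
            negatives.append(query)        # cut wasted spend
        elif int(row["conversions"]) >= CONVERSION_THRESHOLD:
            exact_additions.append(query)  # proven converter
            negatives.append(query)        # funnel it away from phrase match

print("Add as exact match:", [f"[{q}]" for q in exact_additions])
print("Add as phrase-campaign negatives:", [f"[{q}]" for q in negatives])
```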
Anu Adegbola
Regarding what I think makes a successful campaign… a few things come to mind straight away: a good structure based on site structure, keyword performance, or match types (Broad vs. Phrase vs. Exact); clear indication of testing being done (ad copy, bid strategies, match types, etc.); and robust use of different bid strategies (according to performance, not just a one-size-fits-all solution).
Varied use of ad copy types – expanded text ads alongside responsive search ads and dynamic ads, as well as extra functions like countdowns, IF statements, etc. Indication of regular search query analysis being done. Use of automation as needed.
Brett Bodofsky
What makes an effective PPC campaign (specifically Google Ads and Microsoft Ads) is a robust structure which allows for ads that are highly tailored to their targeting. The targeting and copy/creative work together, in unison, to help achieve a clearly defined business objective (obtaining leads, sales, customer retention, creating brand awareness). Landing page also plays an important role in the success of a PPC campaign.
A campaign with refined targeting, bids and creative can drive highly qualified traffic to a page. If the page that traffic is sent to is not what a user would expect, has poor functionality or looks unprofessional, that can severely damage the effectiveness of a PPC campaign.
Another factor that, like the landing page, gets set up outside of campaigns but can impact effectiveness is the conversion action. For intelligent bidding decisions and iterations to be made, campaigns need a proper conversion action to work off of.
Another factor that can make a PPC campaign effective is time. It sounds so simple, but time allows for data to come in, which helps the machine learning make more informed decisions.
It is the sum of many aspects which create an effective campaign (settings, targeting, bids, budget, creative, messaging, proper tracking, experimentation, demand, landing page, competition, etc). One of these aspects being off in some way can damage the integrity of your campaign and make it less effective.
A campaign might not be effective as soon as you launch, or even in the first week or two, but in due time, with more data in the door, it can become effective as machine learning gains more to optimize off of and the practitioner makes manual optimizations and iterations.
Remember: The final deadline is this Friday, September 3 at 11:59pm PT. Standard entrance fees are $595 per application — and you can submit to as many categories as you like.
Search Engine Land’s daily brief features daily insights, news, tips, and essential bits of wisdom for today’s search marketer. If you would like to read this before the rest of the internet does, sign up here to get it delivered to your inbox daily.
Good morning, Marketers, and how do you use cross-organizational data?
Sharing data between SEO and PPC has probably been one of the easier silos for search marketers to break. It’s pretty simple to go check how your paid search advertising is going and use that data to improve your SEO (check out how Lily Ray did it here). And PPC experts can do the same with SEO data.
But one thing I rarely hear search marketers talk about is data-sharing across the organization. A company I once worked with told me about how they used their call recordings to improve their SEO and PPC. Not only were they adding the FAQs that came in over calls to their site, but they also figured out where their “leak” in the funnel was.
The company was driving a ton of traffic, but none of it was converting. Not understanding why, they went to their call recordings and quickly found that… no one was answering the phone at certain locations! It’s easy to think that maybe we should tweak a top-of-funnel campaign when the bottom-of-funnel numbers are suffering — but sharing data might show that there’s a step missing in between.
Carolyn Lyden,
Director of Search Content
Four tools to check for title changes in the SERPs
On August 24, Google confirmed that it changed how it creates titles for search result listings. Unfortunately, title change information isn’t available in Google Search Console or Google Analytics. So, SEOs have turned to third-party tools to see whether their titles are being changed. Below is a list of tools you can use to check for title changes and instructions on how to do so.
Ahrefs: Viewing title changes in Ahrefs is a manual process. You can check for changes via historical SERPs in Site Explorer > Organic Keywords 2.0. Simply toggle the date field to view the SERPs for that particular day.
Rank Ranger: The SEO Monitor tool in Rank Ranger charts rankings over time. Below the chart is a list of all the changes to the page title and description in Google Search. This means if you or Google make any changes to your title or description, it’ll be displayed here with the date that the change occurred.
Semrush: Like Ahrefs, Semrush also offers a manual process to check for title changes. For keywords you’ve been tracking in the Position Tracking tool, click on the SERP icon next to the keyword. That will pull the search results page for the date selected in the report. If you suspect a title was changed, you can confirm this by changing the date in the report and repeating this process to compare titles.
SISTRIX: In the left-hand navigation, under SERPs > SERP-Snippets, select “Show title changes.” There, in the “Title” column, you can view title changes. The red text indicates words that have been dropped from the title and the green text indicates words that have been added.
We’re not surprised, then, to see news about Google paying billions to beat out Bing as the default search engine on Safari.
Google pays $15 billion to be Apple’s default search engine. Google paying Apple to be its default search engine on Safari is also nothing new. In 2018, Google paid Apple $9 billion to be the go-to place for users to find what they need on iPhones, iPads, and Macs. That price has only increased year-over-year, according to the latest report. “The amount is likely to increase to about $20 billion in 2022. Those estimates are based on patterns found in the latest available financial documents from both companies,” wrote Florence Ion for Gizmodo.
The most interesting part is many analysts see this as a fee to essentially prevent Bing from being the default search engine. Jane Horvath, Apple’s senior director of global privacy, told the Computers, Privacy and Data Protection conference (CPDP) that Safari defaults to Google because it’s the most popular search engine, but that users still have the ability to change to the search engine of their choice: “We do support Google but we also have built-in support for DuckDuckGo, and we recently also rolled out support for Ecosia.”
Why we care. This move has many marketers asking, “So when will Apple launch their own search engine?” The company has been propelled mostly by hardware, and this deal with ever-increasing payments from Google is one of its moves toward improving its services options. But perhaps the deal with Google is more lucrative than the potential of having to compete with the biggest player in the marketplace for some advertising dollars.
How Google and Yelp handle fake reviews and policy violations: A side-by-side analysis
Both Google and Yelp have implemented automated systems as their first line of defense against fake reviews and bad actors. And, they both use human moderators for tasks that the technology isn’t suitable for. However, their respective policies, approaches and punitive measures (a few of which are outlined below), which inform the deployment of their technology and human staff, are the most important distinctions to keep in mind as you establish your online presence.
Both platforms can remove illegitimate reviews: On Yelp, every user-submitted report is escalated to its human moderators. Google embraces a preventative, machine learning-first approach, but experts have said that “the success rate is very tiny” when it comes to getting Google to remove fake reviews once they’re live.
Yelp applies ranking penalties; Google declined to comment: When a business violates one of its policies, Yelp may apply a ranking penalty.
Yelp may remove business listings; Google may revoke profile ownership: Violators may be removed from Yelp’s platform. Google stops short of outright removal; instead, it may revoke profile ownership.
Violators can still advertise on Google, but not on Yelp: Businesses are banned from advertising with Yelp for at least one year if they receive a Compensated Activity Alert or a Suspicious Review Activity Alert (including if Yelp finds evidence that they participated in a review ring). Google has no specific ad penalties related to GMB violations.
Yelp continues to monitor listings; Google doesn’t seem to: While Google didn’t disclose any details, Yelp went on record, stating that it has a system that monitors for repeated violations.
Want to know if Google’s changed your title tags? This bookmarklet checks if the title tag in SERPs matches the one you set on your site.
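For the curious, the core of that check is simple enough to sketch. The Python below is a rough equivalent, not the bookmarklet itself: you paste in the title as Google displays it (scraping the SERP directly is brittle and against Google’s terms), and the script fetches your page and compares. The URL and SERP title are placeholders.

```python
import re
import urllib.request

def page_title(url: str) -> str:
    """Fetch a page and extract the contents of its <title> tag."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return match.group(1).strip() if match else ""

serp_title = "Example Domain"  # copied by hand from the search results
own_title = page_title("https://example.com/")
print("Google rewrote it!" if serp_title != own_title else "Titles match.")
```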
Could Google’s “Privacy Budget” break the internet? That’s what Kate Kaye asked on Digiday yesterday about Google’s attempt to prevent fingerprinting.
Would advertisers (and users) switch to a different search engine? If so, here are some non-Google options from contributor John Smith.
What We’re Reading: Despite women’s progress in many parts of society, advertisements still consistently cast women as secondary
“Between 1980 and 2010, women in commercials were shown in workplace settings only 4 percent of the time; frequently they were shown in kitchens, waxing poetic about the products they were selling,” wrote Mara Altman for the NYT. Surely, marketing and advertising have come a long way since then, right? … right?
Sadly, research from Jane Cunningham and Philippa Roberts shows that it really hasn’t, and they say it has to do mostly with who fills the high-level roles at advertising and marketing agencies.
“The Geena Davis Institute on Gender in Media found that ads up for awards at the prestigious Cannes Lions advertising festival depicted male characters working almost twice as often as female characters. Male characters also outnumbered female characters two-to-one and had twice as much screen time and speaking time. Another study conducted by Ebiquity, a media consultancy, found that, of the ads aired in 2016, only 4 percent showed women in leadership positions,” said Altman.
The issue is that marketing isn’t just reflecting what’s happening in real life; it’s affecting women’s real lives, too. “There is a really big body of work around the impact of marketing and just how powerful it is — young women are consuming something like 10,000 messages a day from brands. Think about the collective impact that can have when the same things are being said over and over again, which are usually: Be thinner, be blonder, be more feminine, be hairless, be whiter,” said Cunningham in the interview.
So how can women improve these marketing messages? “The way that women can influence marketing is spending with the brands that are doing the right thing by women and refusing to buy from brands that are very evidently trying to keep women in their place, and/or the place they think women should be,” said Cunningham.
Last week, Google had one of its typical reporting issues, where we thought maybe the data was delayed due to some pipeline bug. But it turns out the data that was lost is likely gone forever and won’t be backfilled in the Search Console performance report.
The statement. Google posted a statement saying that between August 23rd and August 24th “an internal problem caused a data loss in Search and Discover performance during this period. Users might see a significant data drop in their performance reports during this period. This does not reflect any drop in clicks or impressions for your site, only missing data in Search Console.”
Lost data. John Mueller of Google said this means that “it looks like this is really data loss and won’t be back-filled.”
Performance report. The Google Search Console performance report is a report many marketers rely on to see how much visibility and traffic Google Search sends a site. It shows impressions, clicks, average position in search results, clickthrough rate and any special features (such as rich results) associated with your results. Learn more about this report here.
Example of data loss. There are plenty of examples on Twitter of shocking graphs showing huge declines in clicks and impressions in the performance report, so you are not alone. And even if the data looks normal to you, do not assume it is all there. You are probably missing data, and it is safe to assume you had a better-performing day than what Search Console is telling you for August 23rd and 24th.
Why we care. It is important that you annotate this data glitch in your own reporting or client reporting. Google has added an annotation to the Search Console reports, but do not forget: the data for August 23rd and 24th is likely gone forever. It is safe to say you had more impressions and clicks on those days than what Google is showing you. But make sure to communicate the data issue with your clients when you do your monthly reporting.
On Saturday, Google Search Advocate John Mueller clarified that the changes the company has been making to titles in the search results have no effect on rankings.
“This just changes the displayed titles, it doesn’t change ranking or takes [sic] anything different into account,” he elaborated.
Why we care
This means the titles you wrote may still be taken into account by Google when it ranks results. So, don’t stop optimizing your titles just because they may change in the search results. If your clickthrough rates have taken a hit, you can use a third-party tool to check whether Google changed your titles and if it did, it may be worthwhile to experiment with a new title. If, on the other hand, your clickthrough rates have improved, you might want to make note of how Google changed your title for future reference.
That having been said, Google makes ranking changes very often, so you may experience rankings fluctuations, but they won’t be due to title changes.
On August 24, Google confirmed that it changed how it creates titles for search result listings. The confirmation came roughly a week after search professionals began noticing such changes — in the interim (and even after the confirmation), SEOs raised concerns about how these Google-altered titles may affect their traffic.
Unfortunately, title change information isn’t available in Google Search Console or Google Analytics. So, SEOs have turned to third-party tools to see whether their titles are being changed. Below is a list of tools you can use to check for title changes and instructions on how to do so.
Ahrefs. Title changes can be checked in Ahrefs, although it is a manual process. You can check for changes via historical SERPs in Site Explorer > Organic Keywords 2.0.
Since this method shows a list of search results for a given keyword, toggling the “Target only” switch, which only shows the snippet from your site, can help you get to the information you’re looking for a bit faster. You can then compare titles by changing dates.
Rank Ranger. The SEO Monitor tool from Rank Ranger is designed to monitor URLs and show you how they perform in Google Search, based on historical data. The data is displayed in a graph that shows ranking changes over time.
Below the chart is a list of all the changes to the page title and description in Google Search. This means if you or Google make any changes to your title or description, it’ll be displayed here with the date that the change occurred.
Semrush. It is possible to track title changes using Semrush, although the toolset provider does not have a specific feature to do so. For keywords you’ve been tracking in the Position Tracking tool, click on the SERP icon next to the keyword.
That will pull the search results page for the date selected in the report.
If you suspect a title was changed, you can confirm this by changing the date in the report and repeating this process to compare titles. Note: you can only view this information for the period you were tracking those particular keywords.
SISTRIX. In the left-hand navigation, under SERPs > SERP-Snippets, there is a button to “Show title changes.”
The red text indicates words that have been dropped from the title and the green text indicates words that have been added.
Other tool providers. We also reached out to a number of other toolset providers. Screaming Frog and Sitebulb do not support this functionality. And, Moz and STAT did not immediately respond to our inquiries.
Why we care. Knowing when your titles are getting changed, and what they’re getting changed to, can be useful for analyzing any correlation the changes may have on your clickthrough rate. Together, these details may help you decide whether to adjust your titles, or if you’re seeing positive changes, they can also tell you what may be resonating with your audience.
Keyword research is one of the most fundamental practices of SEO. It provides valuable insight into your target audience’s questions and helps inform your content and marketing strategies.
For that reason, a well-orchestrated keyword research strategy can set you up for success. Are you looking to improve the way you’re conducting keyword research?
Join experts from Conductor as they deliver a crash course on keyword research tips, tricks, and best practices.
Review platforms, like Google and Yelp, enable local businesses to expand their online visibility and establish credibility through customer reviews — two important aspects of marketing that SMBs may otherwise struggle with.
Over the last few years, maintaining an accurate online presence has gone from being an important marketing tool to being a lifeline for local businesses. In fact, platforms like Google and Yelp churned out a slew of new features last year in response to the coronavirus pandemic, enabling businesses to rapidly communicate business hours or service changes to their customers.
Unfortunately, bad actors may seek to harm a business’s online reputation through fake reviews or by crowding them out with fake listings. While Yelp and Google both have extensive systems and policies to fight bad actors, there are important distinctions that every local marketer should be aware of, and knowing them can help frame your expectations for each platform as well as enable you to make more informed decisions about where to spend your time and resources.
In addition to a side-by-side analysis of Google and Yelp’s respective detection systems and ramifications for violators, we’ve also featured insights from experienced local search marketing experts on the efficacy of each platform; you can read their insights towards the end of this article.
How Google and Yelp detect fake reviews and listings
Both Google and Yelp have implemented automated systems as their first line of defense against fake reviews and bad actors. And, they both use human moderators for tasks that the technology isn’t suitable for. However, their respective policies and approaches, which inform the deployment of their technology and human staff, are the most important distinctions to keep in mind as you establish your online presence.
Google’s approach seems to emphasize prevention at scale via machine learning algorithms that help to tackle fake reviews and listings. Yelp focuses heavily on the integrity of its reviews and seems to have more robust punitive measures in place for violators.
Google’s automated detection system. Google’s automated systems “use hundreds of cues to detect abusive behavior, such as a shift of review patterns on a business and implausible behavior patterns by reviewers,” a company spokesperson told Search Engine Land. Typical user pattern data (for example, users tend to leave reviews, ratings and photos at places they’ve already been) is one of Google’s most important resources when it comes to identifying illegitimate review content and implementing solutions to combat them.
Google’s machine learning-first approach, which has also become a prominent aspect of its paid and organic search systems, is designed to prevent policy-violating content at scale. “For example, we have focused efforts on detecting content coming from click farms where fake reviews and ratings are being generated,” the company said in a February 2021 blog post, “Through better detection of click farm activity we are making it harder to post fake content cheaply, which ultimately makes it harder for a click farm to sell reviews and make money.”
Machine learning models are also used in the Google My Business verification process to catch fake profiles before they appear on Maps. Ideally, Google’s systems will remove the policy-violating content or flag it for further review, along with the associated user account, before the content gets in front of users. But, some fake reviews and profiles are bound to slip through the cracks, as they do with all platforms, which is why the company also deploys thousands of human analysts.
Google’s human content moderators. Teams of human operators and analysts complement Google’s automated detection systems. These analysts help with content evaluations that algorithms may not be able to analyze, such as understanding local slang within a review.
The company has yet to disclose more information about the role of its human analysts, such as whether they help to improve Google’s systems (the way search quality raters do), stating that, “Staying a step ahead of scammers is a constant battle, so we don’t share specific details about our processes.”
Yelp’s automated detection system. Yelp’s automated recommendation software analyzes data points from all reviews, reviewers and businesses in order to recommend reviews to users, but it also looks for solicited reviews and unfairly biased reviews (like reviews that may be written about a competitor or someone’s own business). This software takes the relevance of the review and the reliability of the reviewer (how often the user is active) into account as well.
Yelp’s human content moderators. Yelp has been relatively open about how it uses both technology and its human content moderators to combat policy-violating content: “When a community member, a business owner or our automated system alerts our team about potential issues, a real human reviews the issue every single time,” Noorie Malik, Yelp’s VP of user operations, told Search Engine Land.
Yelp’s user operations team investigates fraudulent activity, validates new businesses when they sign up with Yelp and works to identify activity that might warrant a Consumer Alert.
Consumer Alerts. When Yelp detects abnormal activity on a business profile, which may be an attempt to manipulate a business’s reviews or ratings, it conducts an investigation that may lead to the application of one of its Consumer Alerts.
These notices appear as a pop-up over the business’s review section and may contain a link to any evidence Yelp has gathered. The platform may also temporarily disable the ability to post reviews when it applies a Consumer Alert. There are currently six types of Consumer Alerts:
Compensated Activity Alert: This may be applied when Yelp has evidence that someone has offered an incentive (such as a discount) in exchange for a review.
Public Attention Alert: This alert was created in response to the rise of social activism surrounding the Black Lives Matter movement. When someone associated with a business is accused of, or the target of, racist behavior, Yelp applies this alert to warn users that the business may be experiencing a spike in reviews due to the increased public attention.
Questionable Legal Threats Alert: Yelp applies this alert when it has evidence that a business is abusing the legal system to intimidate a reviewer.
Racist Behavior Alert: Yelp applies this alert when a business attracts media attention over the use of racist symbols, slurs or other acts of racism.
Suspicious Review Activity Alert: This may be applied when Yelp’s systems detect questionable review activity, such as when a large quantity of reviews originates from a single IP address (a toy illustration follows this list).
Unusual Activity Alert: Sudden media attention may cause an unusual spike in activity on a business profile — for example, instead of basing their review on firsthand experience, users might leave reviews as a form of social commentary. In such cases, Yelp applies this alert and temporarily disables content until activity returns to normal and its moderators clean up the page.
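To make the single-IP signal behind the Suspicious Review Activity Alert concrete, here is a toy Python illustration. Yelp’s actual detection is proprietary and far more sophisticated; the sample data and cutoff below are invented.

```python
from collections import Counter

# Invented sample data: reviews annotated with the submitting IP address.
reviews = [
    {"ip": "203.0.113.7", "stars": 5},
    {"ip": "203.0.113.7", "stars": 5},
    {"ip": "203.0.113.7", "stars": 5},
    {"ip": "198.51.100.2", "stars": 2},
]

by_ip = Counter(r["ip"] for r in reviews)
suspicious = [ip for ip, count in by_ip.items() if count >= 3]  # toy cutoff
print("IPs with an unusual volume of reviews:", suspicious)
```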
Consequences for violating Google and Yelp’s content policies
If you get caught running afoul of Google’s or Yelp’s content policies, your business can face a range of consequences.
Both platforms can remove illegitimate reviews. “Reviews are automatically processed to detect inappropriate content like fake reviews and spam,” Google states in its prohibited and restricted content page, “We may take down reviews that are flagged in order to comply with Google policies or legal obligations.”
In fact, in 2020 alone, Google removed 55 million policy-violating reviews and almost three million fake business profiles. However, these figures, which the company publishes annually, tell an incomplete story because it does not disclose the total number of reviews submitted, active business profiles, reviews and profiles flagged by users and so on.
And, as some of the local search professionals who spoke to Search Engine Land for this article have highlighted in the section below, “the success rate is very tiny” when it comes to getting Google to remove fake reviews once they’re live.
On Yelp, “Reviews that the software determines to be less reliable are moved to a separate ‘not currently recommended’ section of a business’s Yelp page and are not factored into the business’s overall Yelp rating,” said Sudheer Someshwara, Yelp’s head of trust and safety product. The “not currently recommended” section is still accessible to users via the link below the recommended reviews, and reviews may move back and forth between the two sections over time, as the recommendation software continues to learn and evaluate signals.
When content is flagged on Yelp, either by its technology or users, the platform’s team of human moderators manually investigates the complaint, which may result in the removal of fake or purchased reviews, whether those reviews are “recommended” or not.
Yelp applies ranking penalties; Google declined to comment. When businesses violate Google’s policies, the company removes the misleading content and additional penalties may be applied, depending on the specific case. Google declined to comment when asked whether it specifically applies ranking penalties to businesses that violate its policies.
Yelp was more transparent about ranking penalties: “When we find evidence of extreme attempts to manipulate a business’s reputation and inflate their search ranking, we may issue a search ranking penalty,” Someshwara said. These penalties are lifted once the offending behavior has stopped. In particular, search ranking penalties can be applied against businesses that solicit reviews; “If we find indicators of systematic review solicitation, we will apply a search ranking penalty to affected Yelp business pages,” the company said in a support center post.
Yelp may remove business listings; Google may revoke profile ownership. Yelp reserves the right to remove from its platform businesses that seek to artificially manipulate its systems or mislead users.
“If we determine that a business is buying fake reviews or violating any other Google My Business policies, we take swift action ranging from removing content to account suspension and revoking Business Profile ownership,” a Google spokesperson told Search Engine Land.
Violators can still advertise on Google, but not on Yelp. In addition to managing their business profiles and organic presence on Google and Yelp, business owners can take advantage of each platform’s paid products to boost their visibility.
“Businesses are banned from advertising with Yelp for at least one year if they receive a Compensated Activity Alert or a Suspicious Review Activity Alert (including if our team finds evidence that they participated in a review ring),” Someshwara said, adding that these alerts require concrete evidence before they’re applied. Yelp may also ban businesses from advertising if it identifies egregious attempts to manipulate a business’s search ranking or star rating.
This is one area where Yelp and Google differ dramatically, as Google currently has no advertising penalties for businesses that violate policies on the organic side. All of Google’s standard ads policies still apply.
Yelp continues to monitor listings; Google doesn’t seem to. “Yelp has a system in place that monitors and detects if repeated violations occur,” Someshwara said, adding that it also relies on its base of users to report violations.
Google did not disclose details about if or how it continues to monitor business listings that have a history of policy violations. However, the company did emphasize that “we closely monitor 24/7 for fraudulent content, using a combination of people and technology.”
Google and Yelp, through the eyes of practitioners
A business’s experience with a platform often plays out differently in real life than it does when they’re learning about the platform. The three local marketers who spoke to us for this article have extensive experience with both Google and Yelp and their insights can help to frame your expectations when using those platforms.
“Room for improvement in … both platforms.” “I think the strengths Google and Yelp have as local review platforms is their reach and their authority, so to speak,” said Niki Mosier, head of SEO at AgentSync, who spoke more broadly about the pros and cons of these platforms. “I posted a review for a national park less than a week ago with a photo and got a notification yesterday that over 2,000 people have seen that photo,” she provided as an example.
Mosier pointed to the difficulty involved with removing policy-violating reviews as a weak point for both Google and Yelp: “Slanderous or inaccurate reviews can be very harmful to a business,” she said, “I know it’s a slippery slope with letting people get reviews removed but I think there is definitely room for improvement in that area on both platforms.”
Google. Ben Fisher, co-founder of Steady Demand and a Google My Business platinum product expert, and Joy Hawkins, owner of the Local Search Forum, Local U and Sterling Sky, spoke to the respective strengths and weaknesses they’ve experienced across Google and Yelp.
“Google powers local search, and while [reviews] are a conversion factor, they can also be a ranking factor,” Fisher said. People may decide to visit a local business after reading reviews, and reviews are also a ranking factor for Google. “This is a powerful [benefit],” he added.
“[Google’s] strength would be that it is easier for business owners to collect a high volume of reviews without worrying about them all getting filtered (like Yelp),” said Hawkins.
Fisher and Hawkins both singled out fake reviews as a problem for the search engine: “The most negative con is that reviews can be gamed, they can be bought by competitors and in the worst-case scenario, they can be weaponized in an attack,” Fisher said, “A successful negative attack can easily wipe out your star rating and bring down the ranking of GMB and any Local Service Ads you have connected to GMB.” Google does have workflows for handling negative reviews, “but the success rate is very tiny,” he added.
“For Google, their biggest weakness is combating fake reviews,” Hawkins said, “They are beyond terrible at it and even the most obvious cases that get reported by a human (not automatically caught) get missed (Google deems the reviews fine even though they’re fake).”
Yelp. “Yelp’s strength is definitely that they combat fake reviews (and filter them) better than any other platform, in my experience,” Hawkins said. Fisher shared a similar opinion: “Yelp has a better way of handling reviews and I hear clients that get very good removal rates; additionally if there are a lot of fake reviews Yelp may put up a warning to consumers.”
Yelp’s more stringent reviews policy seems to be a double-edged sword for marketers: “The negative is that they also filter out a ton of legitimate reviews which makes the overall ratings for many businesses appear to be much lower than what Google shows,” Hawkins said, “Their no-soliciting policy makes it really hard for businesses to combat negative reviews,” she added.
“This is probably the most frustrating part: a user must have a trusted account to leave a review and many users simply do not fall into this criteria,” Fisher said with regard to Yelp’s recommended reviews, “Therefore, a review on Yelp could be a lost opportunity.”
You don’t get to pick and choose
Don’t put all your eggs in one basket. As marketers, we must meet our customers wherever they are, and in this case, that means growing your presence on both Google and Yelp. Knowing the policy and capability differences of each platform can help you invest your time and resources more wisely, but ultimately, each site should serve to bolster your business’s overall reputation. This can be especially important if you ever become the victim of a negative review attack on one platform, as your presence on other platforms can continue to bring in customers.
Good morning, Marketers, and we need to work on our communication.
I just hopped off a call with the SMX SEO committee, and one takeaway I got from the lively conversation we had is that search marketers don’t always get communication right, whether it’s setting expectations for clients and stakeholders from the beginning or reporting what’s going on in our campaigns and accounts.
We all know the key to solid communication is knowing your audience and knowing what matters most to them. Your CEO cares about top-level metrics. Your tech team cares about the details and specifications. And your associate cares about how they can best do their jobs.
This conversation had me nodding my head and thinking of PPC expert Amanda Farley’s SMX Advanced session on approaching your audience in a whole new way. And how to craft your communication to actually answer what matters most to each person involved.
Sure, it’s meant for your PPC campaigns, but we could probably learn a thing or two from it about communicating with stakeholders, clients, team members and our bosses, too.
Carolyn Lyden,
Director of Search Content
The latest data behind the title tag changes in Google
BowTiedWookie (yep) posted a thread on Twitter after analyzing “10 sites for the same 500 keywords” with the Keywords in Sheets tool. Sure, it’s a small sample size, but the data is worth examining (and seeing if the same is true for your sites). What trends did they find?
The shorter the title the less likely Google is to change it.
Google will change the title ~95% of the time if emojis or weird characters are included.
High DA sites are not being forced to include the brand name.
If Google changes the title it is pulling in the H1 >50% of the time.
The average character count of Google’s title changes was 52 characters (a quick self-audit sketch follows this list).
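If you want to sanity-check your own pages against those trends, a short script can flag the risk factors. The sketch below is illustrative only: the thresholds mirror the thread’s observations, not any confirmed Google rule, and the sample title is made up.

```python
def rewrite_risk(title: str, h1: str) -> list:
    """Flag title traits the thread associated with Google rewrites."""
    risks = []
    if len(title) > 60:
        risks.append("long title (shorter titles were changed less often)")
    if any(ord(ch) > 127 for ch in title):
        risks.append("emoji/unusual characters (~95% rewrite rate observed)")
    if title.strip().lower() != h1.strip().lower():
        risks.append("title differs from H1 (the most common substitute)")
    return risks

for risk in rewrite_risk("Best Ovens 2021 ★ | MegaBrand | Buy Now!!", "Best Ovens of 2021"):
    print("-", risk)
```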
Calling all future-focused search marketers, submit a pitch for SMX Next!
SMX Next returns virtually on November 9-10, 2021, focusing on forward-thinking search marketing. AI and machine learning have already become part of both paid and organic search performance. Commerce platforms are just as powerful as the traditional search engines for driving sales. And new ways to deliver content across search and social platforms are giving creative marketers more options for driving engagement.
SMX Next will explore next-generation strategies, equipping attendees with emerging SEO and PPC tactics as well as expert insights on the future of the search marketing profession. Whether you’ve been speaking for years or are just dipping your toes into speaking, please consider submitting a session pitch. We are always looking for new speakers with diverse points of view. The deadline for SMX Next pitches is September 24th!
10 powerful reasons to enter the Search Engine Land Awards
The global events of the past two years have made it more important than ever for brands and agencies to stand out against the competition. Being able to call your company “award-winning” is one of the most powerful differentiators you can have. That’s just one amazing reason to enter the 2021 Search Engine Land Awards. Need more?
Deadline alert: The final date to submit your Search Engine Land Awards entry is Friday, September 3 at 11:59 pm PT. Begin your application now!
Search Shorts: Local, local, local
How to handle local SEO without a physical address. Yes, your business can rank in search results in cities where you don’t have a physical address. How? By using tools like Google My Business (GMB), by creating content related to the city you want to target, by adding reviews or testimonials from clients in the city you’re targeting, and more.
Google Local Pack without any CTAs. “If you search for [restaurants near me] in Google Search mobile or desktop, you will see the local three pack and map but you won’t see buttons to call the restaurant or directions or a way to order online,” wrote Barry Schwartz on SERoundtable. We’re with Lily Ray on this one: “How is this helpful for users?”
Your Google Posts might start appearing on third-party sites. “Which third party sites would posts appear on?” asked Claire Carlile in a tweet last week. We’re interested to see which types of sites Posts will appear on.
Quote of the Day
“I might be in the minority here, but I think Google has the right to change title tags just like they have the right to change meta descriptions if they think it leads to better experiences for THEIR users,” tweeted Eli Schwartz, author of “Product-Led SEO.”
The SEO industry has a large problem: an overwhelming lack of resources and available talent.
This becomes more apparent at the enterprise level, where data sets are vast, dense, and complex, making it difficult not only to make sense of the data but to act on it.
Not to mention that the numerous SEO tools out there are advanced and great at pointing out issues to SEOs, but they lack the ability to actually “do SEO.”
Automating SEO allows marketers to reduce the time from data to insights to action, so they can implement faster and see results.
The number one challenge we hear from enterprise SEOs is getting things done.
In fact, a study by Moz in 2016 found that the majority of SEOs didn’t see their recommendations implemented for “close to 6 months after they were requested.”
Misaligned priorities can make it difficult to implement even the smallest of changes — a simple tweak to the metadata can take months to deploy to production!
Ramesh Singh, Head of SEO at Great Learning, echoes this point in response to the question, “What SEO function/task do you most wish you could do faster and at bigger scale than you can now?”
All the while, Google continuously becomes more intelligent, using machine learning and AI to serve users the most relevant information in the most effective way. The SERPs are more complex than ever, with Google entities becoming the direct competition.
There’s too much data to act on for any human agent alone. Automation, however, works around the clock and can generate insights that would otherwise take hours to reach.
SEOs naturally face human limitations in their ability to sort through data and act on insights at the level that’s required today — AI and machine learning can fill the gaps and accelerate execution.
In order to reap the benefits of automation, it’s important to start with the basics before moving on to automated insights, and lastly, automated execution.
Automation has changed and evolved over the years, so let’s break down what SEO automation looked like in the past compared to its current state.
#1. Beginner: Data collection
Beginner-level automation starts with pulling together data sets and learning how to drive the car, so to speak.
Data aggregation and collection is an important step in your maturity in SEO. If you are still manually downloading and aggregating data in an attempt to draw insights, there are easier ways to do this.
It includes things like:
Rank tracking
Reporting and dashboards
Backlinks
Site crawl data
If you are new to automation, you need to start here! Automation compounds itself, so you can’t benefit from it unless you start with the basics.
The good news is that data collection and aggregation is a commodity. Hundreds of SEO tools offer this for a fraction of the cost it once was.
Enterprise organizations, however, have extra considerations to be mindful of: security, stability, reliability, scale, and SLAs should all be taken into account when working with automation tools.
An SEO platform like seoClarity consolidates data from rankings, links, and site crawl data for your site and any competitor and automates reporting at scale with enterprise security.
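In practice, this beginner step can be as simple as joining tool exports into one report. Below is a minimal Python sketch under assumed file and column names (“ranks.csv” and “crawl.csv” with “url”, “rank” and “status” columns); any rank tracker’s or crawler’s export would slot in the same way.

```python
import csv

# Assumed exports: ranks.csv (url, rank) and crawl.csv (url, status).
with open("ranks.csv", newline="") as f:
    ranks = {row["url"]: row["rank"] for row in csv.DictReader(f)}
with open("crawl.csv", newline="") as f:
    crawl = {row["url"]: row["status"] for row in csv.DictReader(f)}

# One combined report: every URL seen by either tool, side by side.
with open("report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "rank", "crawl_status"])
    for url in sorted(set(ranks) | set(crawl)):
        writer.writerow([url, ranks.get(url, ""), crawl.get(url, "")])
```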
I’ve done the basics. Tell me what’s new in SEO Automation.
#2. Intermediate: Automated insights
As automation progresses, SEOs can prioritize their time where it counts and let the technology surface site-specific, actionable insights for them.
This begins with alerts based on AI: receive a notification if the AI detects a major rank change or if a critical page element is modified or deleted.
In this case, automation works as a member of your team, continuously monitoring thousands of pages, so you don’t have to.
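A stripped-down version of such an alert is easy to picture. The Python sketch below assumes you already store daily rank observations per keyword; the baseline window, threshold, and delivery mechanism are all illustrative, not how any particular platform implements it.

```python
import statistics

def rank_alerts(history: dict, threshold: float = 3.0) -> list:
    """Flag keywords whose latest rank deviates sharply from their recent average."""
    alerts = []
    for keyword, ranks in history.items():
        if len(ranks) < 8:
            continue  # not enough history to judge
        baseline = statistics.mean(ranks[:-1])
        if abs(ranks[-1] - baseline) >= threshold:
            alerts.append(f"{keyword}: rank {ranks[-1]} vs ~{baseline:.1f} baseline")
    return alerts

history = {"double oven wall unit": [4, 4, 5, 4, 4, 5, 4, 12]}  # made-up data
for alert in rank_alerts(history):
    print("ALERT:", alert)  # in practice, route to email or Slack
```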
Once alerts are automated, SEOs can turn their attention to the advanced and custom insights that are delivered from AI. These insights help to identify opportunities and issues so you can prioritize strategy and execution.
Some examples of this include recommendations and actions within:
Technical audits
Content analysis
Log-file analysis
Backlink analysis (identify link partners, quality backlinks, toxic links)
Keyword research (analyze competitors, identify keyword patterns and topics)
Page and rank change alerts
Marketers need to understand that the sophistication of the automation SEO software provides varies greatly.
Advanced AI and machine learning are required to surface and customize insights based specifically on your site’s ranking, crawl, and link data, not just the generic, basic on-page SEO advice that most tools provide.
Even Lily Ray, a respected SEO consultant and influencer, has made this point.
Actionable Insights from seoClarity analyzes data in real-time to reveal custom, site-specific opportunities and issues for content, page speed, indexability, schema, internal links, and more.
These insights are based on your site’s ranking, crawl, and link data, so the technology actually analyzes your site as if it were a member of your in-house SEO team.
Marketers need insights that are personalized to their site and data — if the cookie-cutter approach worked, everyone would rank on page one.
#3. Advanced: Scaling execution
The next step in SEO automation is execution, implementation, and testing.
Everyone agrees the biggest challenge in enterprise SEO is the speed of execution and implementation at scale!
Whether it’s waiting months for simple updates to be implemented or working on a forecast to get a project prioritized, these roadblocks slow down what we KNOW should be implemented, all while Google (and the competition) speeds ahead.
The bottom line is results don’t happen unless Google can see the changes on your site.
The latest development in automation empowers SEO teams to address the roadblocks that stand in their way and deploy changes to a site in real-time without the need for developers, UX teams, and other stakeholders.
Most companies and SEO teams haven’t implemented this level of automation in SEO … yet.
Why? Maybe it’s not accessible, or it seems too complicated.
Even though it’s the top complaint and roadblock we hear from SEO teams, companies struggle to scale execution.
You can start your journey to scale SEO execution today with seoClarity.
Be among the first to leverage advanced automation in SEO with a machine that continuously updates your site in real-time. Sign up to be a part of the beta launch.
Good morning, Marketers, and a lot can happen in a year.
This past Sunday we had one-year photos taken to commemorate my daughter hitting that first birthday milestone next week. Getting the photo gallery yesterday morning sent me down memory lane thinking about how much has changed in 365 days. Babies grow and learn at such a rapid pace during their first years. A little potato human that couldn’t lift her head can now walk, communicate, and sleep through the night (mostly, thank goodness).
The same is true for us as search marketers. Think about where you were in your career a year ago. Probably stuck at home trying to weather a pandemic. But in the meantime, you may have started your own business, started a new job, learned new skills, executed a stupendous campaign and more.
As you’re prepping for Q4 of this year, keep that momentum going (or start it back up if you’ve felt stagnant recently). Plan your goals and create a blueprint to execute them. I remember reading a story about someone who wanted to go back to school in their 50s and they were worried that it was too late in life to “start over” and go to a four-year college.
The motivational part was this: Those years will pass by whether you work toward your goals or not. So you might as well get started on them now.
Carolyn Lyden,
Director of Search Content
Ask the expert: Demystifying AI and machine learning in search
The world of AI and machine learning has many layers and can be quite complex to learn. Many terms are out there, and unless you have a basic understanding of the landscape, it can be quite confusing. In this article, expert Eric Enge will introduce the basic concepts and try to demystify it all for you.
There are so many different terms that it can be hard to sort out what they all mean. So, let’s start with some definitions:
Artificial Intelligence – This refers to intelligence possessed/demonstrated by machines, as opposed to natural intelligence, which is what we see in humans and other animals.
Artificial General Intelligence (AGI) – This is a level of intelligence where machines are able to address any task that a human can. It does not exist yet, but many are striving to create it.
Machine Learning – This is a subset of AI that uses data and iterative testing to learn how to perform specific tasks.
Deep Learning – This is a subset of machine learning that leverages highly complex neural networks to solve more complex machine learning problems.
Natural Language Processing (NLP) – This is the field of AI focused specifically on processing and understanding language.
Neural Networks – This is one of the more popular types of machine learning algorithms, which attempt to model the way that neurons interact in the brain (a minimal sketch follows).
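As a toy illustration of that last definition, here is a single artificial neuron and a tiny two-layer “network” in Python; the weights and inputs are arbitrary values chosen for demonstration, not a trained model.

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs passed through a sigmoid activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # squash output into (0, 1)

x = [0.5, 0.8]                            # two input signals
h1 = neuron(x, [0.4, -0.6], 0.1)          # hidden neuron 1
h2 = neuron(x, [0.7, 0.2], -0.3)          # hidden neuron 2
out = neuron([h1, h2], [1.2, -0.9], 0.0)  # output neuron
print(f"network output: {out:.3f}")
```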
Search marketers should remember their power in the Google-SEO relationship
Google has essentially said that SEOs (or those attempting SEO) have not always used page titles as they should, and that it has been adjusting displayed titles for a while (since 2012). “Title tags can sometimes be very long or ‘stuffed’ with keywords because creators mistakenly think adding a bunch of words will increase the chances that a page will rank better,” according to the Search Central blog. Or, in the opposite case, the title tag hasn’t been optimized at all: “Home pages might simply be called ‘Home’. In other cases, all pages in a site might be called ‘Untitled’ or simply have the name of the site.” And so the change is “designed to produce more readable and accessible titles for pages.”
This title tag system change seems to be another one of those that maybe worked fine in a lab, but is not performing well in the wild. The intention was to help searchers better understand what a page or site is about from the title, but many examples we’ve seen have shown the exact opposite.
The power dynamic is heavily weighted to Google’s side, and they know it. But the key is to remember that we’re not completely powerless in this relationship. Google’s search engine, as a business, relies on us (in both SEO and PPC) participating in its business model.
Search Shorts: YouTube on misinformation, improving ROAS in Shopping and why it’s time to get responsive
YouTube outlines its approach to policing misinformation and the challenges in effective action. “When people now search for news or information, they get results optimized for quality, not for how sensational the content might be,” wrote Neal Mohan, chief product officer at YouTube.
How to improve Google Shopping Ads ROAS with Priority Bidding. “If you feel more comfortable with Search and Display PPC campaigns, manual is a safe bet as you dip your toes into Shopping,” wrote Susie Marino for WordStream.
Forget mobile-first or mobile-only — It’s time to get truly responsive. “If you’re thinking about your website in terms of the desktop site, welcome to the 2010s. If you’re thinking about it mobile-first, welcome to the 2020s. But it’s 2021. It’s time to think about your site the way Good Designers do: it’s time to get responsive,” said Jess Peck in her latest post.
What We’re Reading: Google’s local search trends: From saturation to depth of content and personalization
The focus of GMB (Google My Business) has shifted in recent years from getting more businesses to sign up for the listing service to getting business owners or managers to add even more information about their companies on the platform.
“The new GMB mission is to have businesses provide as much relevant information for as many content areas as possible. Beyond basic contact info, these opportunities include photos, action links, secondary hours, attributes, service details, and several other features. The intent is to make GMB as replete with primary data as possible, so that any pertinent detail a consumer might need to know before choosing a local business is provided in-platform, without the need to click through to other sources,” wrote Damian Rollison for StreetFight.
The local trend matches Google’s overall direction in the search engine results pages: answering everything right there in the SERP. It also does this by personalizing the local results to what it believes is the searcher’s intent.
“The term that has arisen to describe the most prevalent type of local pack personalization is ‘justifications’ (this is apparently Google’s internal term for the feature). Justifications are snippets of content presented as part of the local pack — or, in some cases, as part of the larger business profile — in order to ‘justify’ the search result to the user. Justifications pull evidence from some less-visible part of GMB, from Google users, from the business website, or from local inventory feeds, and publish that evidence as part of the search result,” said Rollison.
So why should marketers care about this? “Personalization represents a broad range of opportunities for businesses to drive relevant traffic from search to store. Answers to questions, photos, website content, and much more can be optimized according to the products and services you most want to surface for in search.”
The world of AI and Machine Learning has many layers and can be quite complex to learn. Many terms are out there and unless you have a basic understanding of the landscape it can be quite confusing. In this article, expert Eric Enge will introduce the basic concepts and try to demystify it all for you. This is also the first of a four-part article series to cover many of the more interesting aspects of the AI landscape.
Current Google AI Algorithms: RankBrain, BERT, MUM, and SMITH
Basic background on AI
There are so many different terms that it can be hard to sort out what they all mean. So let’s start with some definitions:
Artificial Intelligence – This refers to intelligence possessed/demonstrated by machines, as opposed to natural intelligence, which is what we see in humans and other animals.
Artificial General Intelligence (AGI) – This is a level of intelligence where machines are able to address any task that a human can. It does not exist yet, but many are striving to create it.
Machine Learning – This is a subset of AI that uses data and iterative testing to learn how to perform specific tasks.
Deep Learning – This is a subset of machine learning that leverages highly complex neural networks to solve more complex machine learning problems.
Natural Language Processing (NLP) – This is the field of AI focused specifically on processing and understanding language.
Neural Networks – This is one of the more popular types of machine learning algorithms; it attempts to model the way that neurons interact in the brain.
These are all closely related, and it’s helpful to see how they fit together: artificial intelligence encompasses all of these concepts, deep learning is a subset of machine learning, and natural language processing uses a wide range of AI algorithms to better understand language.
Sample illustration of how a neural network works
There are many different types of machine learning algorithms. The most well-known of these are neural network algorithms, so to give you a little context, that’s what I’ll cover next.
Consider the problem of determining the salary for an employee. For example, what do we pay someone with 10 years of experience? To answer that question we can collect some data on what others are being paid and their years of experience, and that might look like this:
With data like this we can easily calculate what this particular employee should get paid by creating a line graph:
For this particular person, it suggests a salary of a little over $90,000 per year. However, we can all quickly recognize that this is not really a sufficient view as we also need to consider the nature of the job and the performance level of the employee. Introducing those two variables will lead us to a data chart more like this one:
It’s a much tougher problem to solve, but one that machine learning can handle relatively easily. Yet we’re not done adding complexity to the factors that impact salaries, as location also matters a great deal. For example, San Francisco Bay Area jobs in technology pay significantly more than the same jobs in many other parts of the country, in large part due to the difference in the cost of living.
The basic approach a neural network would use is to guess at the correct equation using the variables (job, years of experience, performance level), calculate the potential salary using that equation, and see how well it matches our real-world data. This tuning process is referred to as “gradient descent,” and a plain-English way to describe it is “successive approximation.”
The original salary data is what a neural network would use as “training data” so that it can know when it has built an algorithm that matches up with real-world experience. Let’s walk through a simple example starting with our original data set with just the years of experience and the salary data.
To keep our example simpler, let’s assume that the neural network we’ll use understands that 0 years of experience equates to $45,000 in salary and that the basic form of the equation should be: Salary = Years of Service * X + $45,000. We need to work out the value of X to come up with the right equation. As a first step, the neural network might guess that the value of X is $1,500. In practice, these algorithms make their initial guesses randomly, but this will do for now. Here is what we get when we try a value of $1,500:
As we can see from the resulting data, the calculated values are too low. Neural networks are designed to compare the calculated values with the real values and provide that as feedback which can then be used to try a second guess at what the correct answer is. For our illustration, let’s have $3,000 be our next guess as the correct value for X. Here is what we get this time:
As we can see, our results have improved, which is good! However, we still need to guess again because we’re not close enough to the right values. So, let’s try a guess of $6,000 this time:
Interestingly, we now see that our margin of error has increased slightly, but we’re now too high! Perhaps we need to adjust our equation back down a bit. Let’s try $4,500:
Now we see we’re quite close! We can keep trying additional values to see how much more we can improve the results, which brings into play another key decision in machine learning: how precise we want our algorithm to be, and when to stop iterating. For the purposes of our example, we’re close enough, and hopefully you now have an idea of how this all works.
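To make that loop concrete, here is a minimal Python sketch of the guess-and-adjust process described above. The training data is invented for illustration (chosen so the answer lands near the $4,500 the walkthrough arrives at), and real neural networks involve far more machinery, but the core “successive approximation” idea is the same:

```python
# Learn X in Salary = Years of Service * X + $45,000 by
# successive approximation. All salary figures are invented.
years = [1, 2, 4, 5, 7, 10]
salaries = [49000, 54500, 63000, 67500, 76000, 90500]

base = 45000.0      # assumed salary at 0 years, as in the example
x = 1500.0          # the walkthrough's first guess for X
learning_rate = 0.01

for step in range(1000):
    # Predict salaries using the current guess for X
    predicted = [y * x + base for y in years]
    # Compare the predictions against the real (training) data
    errors = [actual - p for actual, p in zip(salaries, predicted)]
    # Nudge X in the direction that shrinks the error (gradient descent)
    gradient = -2 * sum(e * y for e, y in zip(errors, years)) / len(years)
    x -= learning_rate * gradient

print(f"Learned X: ${x:,.0f} per year of service")  # lands near $4,500
```

Each pass through the loop is one “guess” like the $1,500, $3,000, $6,000 and $4,500 tries above, except the feedback on each guess is computed automatically rather than eyeballed.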
Our example machine learning exercise had an extremely simple algorithm to build, as we only needed to derive an equation of this form: Salary = Years of Service * X + $45,000 (a.k.a. y = mx + b). However, if we were trying to calculate a true salary algorithm that takes into account all the factors that impact salaries, we would need:
a much larger data set to use as our training data
to build a much more complex algorithm
You can see how machine learning models can rapidly become highly complex. Imagine the complexities when we’re dealing with something on the scale of natural language processing!
Other types of basic machine learning algorithms
The machine learning example shared above is an example of what we call “supervised machine learning.” We call it supervised because we provided a training data set that contained target output values and the algorithm was able to use that to produce an equation that would generate the same (or close to the same) output results. There is also a class of machine learning algorithms that perform “unsupervised machine learning.”
With this class of algorithms, we still provide an input data set but don’t provide examples of the output data. The machine learning algorithms need to review the data and find meaning within it on their own. This may sound scarily like human intelligence, but no, we’re not quite there yet. Let’s illustrate with two examples of this type of machine learning in the real world.
One example of unsupervised machine learning is Google News. Google has systems to discover articles getting the most traffic from hot new search queries that appear to be driven by news events. But how does it know that all of those articles are on the same topic? While it can do traditional relevance matching the way it does in regular search, in Google News this is handled by algorithms that determine the similarity between pieces of content.
As shown in the example image above, Google successfully grouped numerous articles on the passage of the infrastructure bill on August 10, 2021. As you might expect, articles focused on describing the event and the bill itself likely have substantial similarities in content. Recognizing those similarities and grouping the articles accordingly is an example of unsupervised machine learning in action.
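As a rough sketch of how that kind of content similarity can be computed, here is what it might look like in Python using TF-IDF vectors and cosine similarity. The headlines are invented, and this uses scikit-learn purely for illustration; it is not a claim about what Google actually runs:

```python
# Compare invented headlines by content similarity; assumes scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = [
    "Senate passes $1 trillion infrastructure bill after months of debate",
    "Infrastructure bill clears the Senate and heads to the House",
    "Local bakery wins statewide award for best sourdough",
]

# Turn each headline into a TF-IDF vector, then compare every pair
vectors = TfidfVectorizer(stop_words="english").fit_transform(articles)
scores = cosine_similarity(vectors)

# The two infrastructure stories score far higher against each other
# than against the bakery story, so a simple threshold can group them
for i in range(len(articles)):
    for j in range(i + 1, len(articles)):
        print(f"articles {i} and {j}: similarity {scores[i, j]:.2f}")
```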
Another interesting class of machine learning is what we call “recommender systems.” We see this in the real world on e-commerce sites like Amazon, or on movie sites like Netflix. On Amazon, we may see “Frequently Bought Together” underneath a listing on a product page. On other sites, this might be labeled something like “People who bought this also bought this.”
Movie sites like Netflix use similar systems to make movie recommendations to you. These might be based on specified preferences, movies you’ve rated, or your movie selection history. One popular approach to this is to compare the movies you’ve watched and rated highly with movies that have been watched and rated similarly by other users.
For example, if you’ve rated four action movies quite highly, and a different user (whom we’ll call John) also rates action movies highly, the system might recommend movies that John has watched but you haven’t. This general approach is called “collaborative filtering” and is one of several approaches to building a recommender system; a toy version is sketched below.
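Here is a minimal Python sketch of that idea. The users, movies, and ratings below are entirely made up, and production recommender systems are far more sophisticated, but the “find a similar user, borrow their favorites” logic is the essence of collaborative filtering:

```python
# Toy user-based collaborative filtering; all data is invented.
import numpy as np

movies = ["Action A", "Action B", "Action C", "Action D", "Drama E"]
ratings = np.array([
    [5, 4, 5, 0, 2],   # you: loves action, hasn't seen "Action D"
    [5, 5, 4, 5, 1],   # John: similar taste, has rated "Action D"
    [1, 2, 1, 2, 5],   # a drama fan with very different taste
])

def similarity(a, b):
    # Cosine similarity over the movies both users have rated
    both = (a > 0) & (b > 0)
    if not both.any():
        return 0.0
    return float(np.dot(a[both], b[both]) /
                 (np.linalg.norm(a[both]) * np.linalg.norm(b[both])))

you = ratings[0]
# Find the other user whose ratings look most like yours...
scores = [similarity(you, other) for other in ratings[1:]]
neighbor = ratings[1 + int(np.argmax(scores))]
# ...and recommend movies they loved that you haven't seen yet
for idx in np.where((you == 0) & (neighbor >= 4))[0]:
    print("Recommended:", movies[idx])   # prints "Action D"
```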
Note: Thanks to Chris Penn for reviewing this article and providing guidance.