Monday, October 31, 2022

Google introduces Ads Data Hub for Marketers and Measurement Partners

Google’s Ads Data Hub helps advertisers run customized analyses aligned with their business objectives while maintaining user privacy. Because marketers and measurement partners have different needs, Google has created two different solutions: Ads Data Hub for Marketers and Ads Data Hub for Measurement Partners.

Ads Data Hub for Marketers

The Ads Data Hub for Marketers is a way for advertisers to analyze their data, get access to insights, and be better informed when they buy media.

Publisher Advertiser Identity Reconciliation. Publisher Advertiser Identity Reconciliation (PAIR) is a solution that gives publishers and advertisers the option to securely and privately reconcile their first-party data for audiences who have visited both an advertiser’s and a publisher’s site.

New query templates, automated workflows, and reporting updates will reduce the time needed to generate insights. With the implementation of PAIR, marketers will be able to activate their audience segments on new inventory, including YouTube.

Ads Data Hub for Measurement Partners

The Ads Data Hub for Measurement Partners gives partners the ability to provide YouTube insights and measurement services on behalf of marketers, advertisers, agencies, or publishers.

The launch of Ads Data Hub for Measurement Partners means that it’ll be easier for partners to offer accurate measurements and deliver real-time insights. For marketers, it means they can work with independent, third-party partners to calculate and report on YouTube ad performance across devices, formats, and metrics.

Additional information

Google offered the following information in addition to the recent announcement on the blog:

The tailored experiences allow us to focus on expanding our current features for marketers who already use ADH to make the most of their first-party data and build audiences in a privacy-centric way, and focus on feature development that brings the most value to our Measurement Partners. 

  • For example, we’ve now reduced enablement for Measurement Partners from more than 10 hours to just 60 seconds for advertisers to share their YouTube data via ADH, maximizing reporting coverage and near-real-time reporting.
  • We also just expanded YouTube for masthead and In-feed Video for Consideration for both Measurement Partners and Marketers, which provides a more comprehensive and holistic look at their YouTube measurement.

When Ads Data Hub was first launched, it supported mobile and desktop, but we’ve since introduced CTV measurement. It is now a holistic, cross-device measurement solution that allows for richer insights in a way that doesn’t sacrifice user privacy. 

Other key features for Ads Data Hub: 

  • MRC Accreditation: For the past two years, Ads Data Hub has received MRC accreditation, underscoring our commitment to creating a solution that meets industry standards and needs. 
  • Cross-media measurement: Through a number of our measurement partners, we have expanded measurement services on Ads Data Hub to enable cross-media solutions for YouTube. Customers are able to analyze the performance of YouTube campaigns relative to other media channels (including linear TV, streaming TV, or online video sources).
  • Co-viewing: As the way consumers watch TV content shifts, it’s become more complex to measure impact. Over the past year, we have introduced co-viewing with some of our reach measurement partners, allowing advertisers to measure when multiple viewers are watching YouTube TV and YouTube on CTV. 

Why we care. Different measurement solutions will help advertisers and partners measure insights for YouTube and CTV campaigns to analyze the performance across multiple platforms, in real-time. The new privacy-centric measurement tools protect the data of users online while still providing advertisers, agencies, and partners with comprehensive analytics.

The post Google introduces Ads Data Hub for Marketers and Measurement Partners appeared first on Search Engine Land.

Google’s 2010 and 2018 page speed algorithms were replaced by core web vitals

In the history of Google page speed ranking algorithms, there have been three different announcements: the page experience update from 2020, which uses core web vitals; the Google page speed update from 2018; and the Google site speed update from 2010.

Google only uses core web vitals. Google’s John Mueller confirmed on Twitter that Google now only uses the most recent update for page speed ranking, specifically the page experience update that measures page speed using core web vitals.

John said “it’s all cwv [core web vitals] now” when asked if the old page speed algorithms are still used.

Old page speed algorithms. John said “no” when asked whether Google still uses the old 2010 and 2018 page speed algorithms.

Why we care. Truth is, page speed is a lightweight ranking factor and does not make a huge difference for ranking purposes. Either way, it is good to know that Google no longer uses the old page speed algorithms, even though one might consider it obvious that Google would not run two or more competing algorithms.

The post Google’s 2010 and 2018 page speed algorithms were replaced by core web vitals appeared first on Search Engine Land.

3 new tools to measure Google app performance and privacy

Google has just announced three new tools to measure cross-platform app conversions, strengthen performance with privacy-focused measurement, and adapt to iOS and Android standards.

Cross-platform conversion measurement. Without cross-platform conversion tracking, Google says advertisers will have measurement gaps and optimization limitations across web and app campaigns.

To solve this issue and help advertisers set up conversion tracking, Google is alpha testing Web to App Connect, a toolkit that guides advertisers through the process step by step.

Deep linking. Google is also prioritizing a frictionless web-to-app experience through deep linking. Deep linking connects the mobile web experience to the company’s app to ensure a smoother, faster checkout experience.

Web to App Connect will be available in beta early next year.

Privacy-focused measurement. Google Analytics for Firebase, Google’s privacy-forward measurement solution, provides advertisers with cross-platform measurement capabilities to understand how users engage with their apps and to optimize App campaign attribution, reach and performance.

Google says when you implement Analytics for Firebase you’ll unlock additional benefits such as audience management and tROAS bidding for app campaigns, and:

  • Maximized performance: Additional consented, signed-in data to automatically improve App campaign ramp up and modeling to more efficiently reach consumers.
  • Expanded inventory: By early next year, advertisers will get access to connected TV and the ability to measure performance across devices.
  • New audience lists: You’ll have the ability to more easily engage existing app users who have not been reached with push notifications.

Adapting to shifts in platform standards on iOS and Android. Google is now supporting solutions that help advertisers navigate the rollout of the AppTrackingTransparency (ATT) framework on iOS, including the new version 4.0 updates to SKAdNetwork, Apple’s measurement API.

Next year, Google will also release tools for advertisers using the Google Analytics for Firebase SDK that streamline setting up SKAdNetwork conversion values and help them better optimize their campaigns for in-app actions.

Why we care. Measuring cross-device app conversions, enhancing reporting accuracy, and preserving privacy are essential for app advertisers to accurately optimize and scale campaigns in 2023 and beyond.

Advertisers can start setting up SKAdNetwork conversion values themselves or work with a measurement partner.

The post 3 new tools to measure Google app performance and privacy appeared first on Search Engine Land.

10 spooky SEO tactics to axe in 2023

It was a cold, dark night. At the end of the long path was an old, rundown mansion covered in cobwebs. Inside was a candle flickering next to the shadow of a person. 

You rub the frost off the stained-glass window on the doorstep, and there’s just enough light to see what the man inside is doing. He’s moving his body from side to side, chanting numbers – what could it be? Black magic? 

You put your ear to the window to find out what you can hear – and you can make out … something:

“One, two, three, four paid links!” 

Alas – he’s not a sorcerer at all – but something even more chilling: A marketer performing the ancient practice of black hat SEO! 

Does this tale spook you as much as it does me? It’s almost 2023, and if you’re involved in buying links – or any of the 10 things I’m about to detail in this article – you won’t have a chance to compete in the search results.

So without further ado, here are 10 spooky SEO practices to axe in 2023.

1. Not getting buy-in from the top

Your company may say that you can “do SEO” without actually understanding what SEO means.

Fast-forward to a few months down the road when you need to make big decisions about the site, and management is nowhere to be found. 

To be successful in SEO, you need commitment from the top.

In addition, SEO needs to be thought of even when you, the marketer, are not in the room. 

Every decision on the website impacts SEO. When you have proper buy-in, you can solve so many other issues featured in this article.  

2. Hiring people that know less about SEO than you

You’ve hired a big, brand-name agency because you heard they were the best. 

The company assigns an SEO professional with a fancy title to your account – what could go wrong? 

Except after only a few weeks, it is apparent that you know more than your SEO team does. 

And, with a little digging, you find that the person servicing your account has only a few years of SEO under their belt.

This is a real phenomenon, folks. 

Make sure you research the people you hire before you sign that contract. Otherwise, your working relationship will not be fruitful.

3. Being sure you have a plan that will stick

“Man plans, Google laughs.”

OK, that isn’t quite how that old expression goes, but the reality is that the only thing constant in SEO is change. 

With Google making thousands of changes to search each year, and your competitors even more, how can you seriously plan for SEO six months from now? 

You do not have any knowledge about the changes you will encounter. So come to terms with the fact that a long-term SEO plan is worthless. 

What to do instead?

Run in four-week sprints and re-evaluate what the website needs after each sprint.

4. Getting SEO advice and not implementing it

If you’ve invested time and money into hiring an SEO only to ignore their recommendations, then don’t be surprised by the results you don’t get. 

I understand. Sometimes it seems like an uphill battle to get things done. 

That is why having buy-in and a plan for how you will implement SEO strategies is the first step before engaging in SEO services.  

5. Ignoring the hard changes

When faced with business silos, competing priorities and a lack of resources, it may seem impossible to get the “hard” changes done to a website.

Sometimes, they are partially or even poorly implemented to try and move the needle.

The hard changes, though, are those changes that can make a fundamental difference to your SEO program.

If the recommendation is to do them, make a case for getting them done and, if you need to, hire outside expertise to do them the right way.

6. Thinking any content is good content

If you go to the heart of almost any Google algorithm update, you will find it centers on quality content. 

To succeed now and in the future, you need helpful content – expert, authoritative and trusted content. 

You must somehow stand out from the competition rather than regurgitate what everyone else is saying. 

Spinning others’ ideas equates to average content. And Google does not reward the average.

7. Thinking all keywords are equal

Inventing keywords does not mean that anyone would search for them. Thus they may get no traffic. 

That is just one point, but you must also consider that there are many keyword strategies, and they vary by industry. 

Matching your content to query intent can help you perform better in search – and is the key to being considered an expert and gaining ranking. 

And if it is not ranked, then the content is nearly worthless.

8. Not looking at PPC data

Unfortunately, PPC data is often ignored. And SEO and PPC teams often feel at odds with one another.

Knowing what converts in PPC is a solid indication of the ROI for each keyword. 

Also, by studying the PPC negative keywords that flag ambiguous terms, the SEO can spot issues that point to the need for schema. 

Bottom line: If certain keywords have a clear meaning and great conversion, then you may want them in your SEO program. 

9. Buying links

By now, we shouldn’t still be having the “paid links” conversation.

Yet many websites still engage in this practice – unknowingly or knowingly. 

To be fair, link buying is not always a black-and-white issue; there are shades of gray. 

For example, if I pay someone to write an article and place it on another site, is that a paid link? Google thinks so.

The remedy to paid links? Create things worth linking to and then let others know about them.

10. Not taking any SEO training

How will you have meaningful discussions with your SEO team if you don’t know what they are talking about? 

How will you get Bob in IT to actually make changes to the website if he has no SEO knowledge? 

It is so critical that in-house teams have a baseline understanding of SEO, as well as keep up with emerging strategies. 

SEO training is an excellent way to get teams up-to-speed on SEO. 

This proactive step helps ensure you are making sound decisions and can keep things moving forward.  

Let go of these spooky SEO tactics

These are spooky times and, unfortunately for many websites, scary SEO tactics still exist. 

Give the axe to the 10 items in this article and you will have a chance at competing in the search results in 2023.

The post 10 spooky SEO tactics to axe in 2023 appeared first on Search Engine Land.

Microsoft Performance Max import updates

Microsoft Ads has just launched a solution within the Google Import tool to simplify duplicating your Performance Max campaigns across platforms.  

They’ve also started a pilot program for importing Performance Max campaigns that aren’t using a Merchant Center. The new experience will import these campaigns as Search campaigns and create Dynamic Search Ads (DSA). 

Best practices. Microsoft outlines best practices for importing Performance Max campaigns both with and without a Merchant Center.

Use the Google Import Tool. You can access the Google Import Tool here.

Dig deeper. Read the Microsoft blog announcement and access the setup checklist here.

Why we care.

The post Microsoft Performance Max import updates appeared first on Search Engine Land.

5 things your Google Looker Studio PPC Dashboard must have

If you’re just getting started with Google Looker Studio, you’ve probably experienced blank-page syndrome.

You get your data source connected, open up a new file, and you have no idea what to do next.

There are no instructions. No guide rails. Just you and an empty page to fill.

And while you can start with a template (Google Looker Studio Report Gallery has several), it’s still tough to know how to customize it to perfectly fit your needs.

Here are some tried-and-true elements to include in PPC dashboards and reports that will banish blank-page syndrome and give your stakeholders the insights they crave.

1. Titles, subheads and context

When you add a chart in Google Looker Studio, you select the data source, dimensions, metrics and date range from the Data Panel to populate your visualization.

But your reader doesn’t see the Data Panel and won’t know what your chart is about unless you take an extra step to include it in your dashboard.

The two graphs below show identical data visualizations. Figure A includes only the chart, while Figure B includes written titles and context.

Figure A leaves questions in your reader’s mind that Figure B answers.

You can make your graphs and tables easier to understand at a glance with these tips:

  • Give your data visualizations a title.
  • Use subheadings and microcopy for additional context.
  • Use legends.
  • State the date range if it’s not included in the chart. (Note: “auto” date range defaults to last 28 days.)
  • If multiple data sources are used throughout your dashboard, clarify which is used in specific charts.

How to do it:

  1. Add a text box and write out your titles and descriptions.
  2. This will open up a “Text Properties” panel to edit fonts, text size, and styling elements.

It’s worth the small manual effort it takes to add a text box and include context!

2. KPI scorecards

You don’t need an article to tell you that your dashboard should include your key performance indicators (KPIs).

But while you’re planning out your dashboard, pay special attention to where to include them.

Your KPIs matter most in your report and deserve top billing.

That means showcasing your KPIs with scorecards like so:

Not as afterthoughts at the end of a table:

Not only do tables make it hard to identify KPIs, but for languages that are read left to right, tucking KPIs on the far right of the table also tells your reader these metrics are low priority.

Keep your reader focused on your key growth metrics like lead volume, revenue, or return on ad spend (ROAS), rather than vanity and traffic metrics like impressions and clicks.

How to do it:

  1. Use Chart > Scorecard.
  2. In the “metric” section of the Data Panel, add your KPI. Repeat as needed.
  3. Control format and size in the Style Panel.

Having KPIs appear in tables and other charts isn’t a problem, but give them added attention by using scorecards.

3. Goal pacing

Some advertisers use fixed monthly or annual marketing budgets with no room for adjustments.

Others have sales or efficiency goals they need to hit with flexible budgets.

No matter what the approach, your dashboard should answer the question:

Are we meeting our objectives, and how do we know?

Account objectives aren’t standardized, and neither is the approach for including goal pacing in your dashboard.

Fortunately, Looker Studio gives you many options for adding objectives and pacing, from literally charting against a goal to adding a written description of the target.

Here are some examples of how you might anchor performance to a goal:

How to do it:

  1. Option: Add a header that states the objective
  2. Option: Use a pacing chart such as bullet or gauge
  3. Option: Add a calculated field with progress to goal (metric/target)

Including goal pacing gives your reader confidence in how to interpret performance data.
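
If it helps to see the arithmetic behind the calculated-field option above, here is a minimal Python sketch of a progress-to-goal pacing check. The metric, target and dates are hypothetical, and in Looker Studio itself the same ratio would live in a calculated field rather than code.

```python
# Minimal sketch of progress-to-goal pacing (hypothetical numbers).
# In Looker Studio, the equivalent logic is a calculated field (metric / target).
from datetime import date
import calendar

def goal_pacing(metric_to_date: float, monthly_target: float, today: date) -> dict:
    """Compare actual progress to where the metric 'should' be by today."""
    days_in_month = calendar.monthrange(today.year, today.month)[1]
    expected_share = today.day / days_in_month     # share of the month elapsed
    progress = metric_to_date / monthly_target     # the metric/target ratio
    return {
        "progress_to_goal_pct": round(progress * 100, 1),
        "expected_pace_pct": round(expected_share * 100, 1),
        "on_track": progress >= expected_share,
    }

# Example: 240 leads so far against a 500-lead monthly goal, mid-month.
print(goal_pacing(240, 500, date(2022, 10, 15)))
# {'progress_to_goal_pct': 48.0, 'expected_pace_pct': 48.4, 'on_track': False}
```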

4. Trends and historical comparisons

Trends and historical comparisons let your reader know if things are improving – or need improvement – over time.

Maybe you fell short of the goal, but you always miss it because it’s unrealistic.

Maybe you hit your goal, but you’re down compared to last year, and you need to take corrective action.

Don’t make your reader wonder whether current performance is average, down or “best month ever.”

Snapshot (single-metric) comparisons

Tables and scorecards give you an easy way to show your reader how performance for this period compares to another, using color-coded arrows to indicate the direction of the change (delta).

How to do it:

  • Under “Date range,” select your comparison date range:
    • Fixed
    • Previous period
    • Previous year
    • Advanced
  • In the Style Panel:
    • Control the color of positive or negative change arrows
    • For Scorecards only, you can select whether to show absolute or percentage change and whether to include a description of the previous time period (comparison label).
    • You can also format Scorecards to show both YoY and MoM comparisons.

Line charts

You can get a complete picture of performance trends using time series charts. 

Rather than just comparing this period to the last period, you’ve got an entire history revealing trends in seasonality, market impact and more.

You can use a continuous Time series chart (shown above) or designate a comparison time period.

Here’s how that same data looks as a Year over Year (YoY) Time series chart. Note that the comparison year will show as a lighter shade of this period’s line:

Another way to show historical performance is with a line chart that uses a time period as a breakdown dimension.

This Line chart is from a report comparing CPCs before and during the Covid-19 pandemic:

How to do it:

  • To compare two time periods: Use a Time series chart and select a comparison date range.
  • To compare three or more time periods (shown here for years):
    • Select a Line chart
    • Set the “Dimension” to Month
    • Set the “Breakdown Dimension” to Year
    • Set the “Sort” to Month
    • Set the “Secondary sort” to Year
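
If it helps to picture the data shape those settings produce, here is a minimal pandas sketch (with hypothetical column names) of the same month-by-year breakdown: each year becomes its own column, and therefore its own line on the chart.

```python
# Minimal sketch: reshape daily PPC data into a month-by-year comparison,
# mirroring Dimension = Month and Breakdown Dimension = Year in Looker Studio.
# The "date" and "clicks" columns are hypothetical placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
daily = pd.DataFrame({"date": pd.date_range("2020-01-01", "2022-12-31", freq="D")})
daily["clicks"] = rng.integers(200, 400, size=len(daily))  # placeholder metric

daily["year"] = daily["date"].dt.year
daily["month"] = daily["date"].dt.month

# One row per month, one column per year: each column is a line on the chart.
yoy = daily.pivot_table(index="month", columns="year", values="clicks", aggfunc="sum")
print(yoy)
```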

A few important notes for trends and historical comparisons – 

  • Only use these for your KPIs or metrics that directly contribute to your KPIs. Don’t add a CTR trend chart just for the sake of including a trend chart.
  • There’s almost never a reason to show daily granularity in these charts. Zoomed in that closely, you’ll lose the signal in the day-to-day noise. Look for trends at a monthly level.

5. Categorical tables

Okay, so tables aren’t that glamorous. 

But if your Looker Studio dashboard doesn’t have a table, something’s probably missing.

Why? Because there are times when your audience needs to compare multiple categories across multiple metrics. And nothing does that more efficiently than a table.

Tables are great for comparing default categories like:

  • Campaigns
  • Ad groups
  • Keywords
  • Search terms
  • Final URLs

And depending on the complexity of your PPC dashboard, you can create tables for:

  • Engines and platforms
  • Channels and networks
  • Funnels / intent / stages of awareness
  • Brand vs. nonbrand
  • Pivots of time segments, conversion types, and other categories

How to do it:

  • Chart > Table (or Pivot table)
  • Dimension(s): the category or categories you want to compare
  • Metrics: your KPIs and supporting metrics
  • From the Style Panel, you can format your table to include heatmaps, bars and targets

It’s easy to build tables and add metrics, and it’s easy to get carried away. Exercise some restraint and limit the number of metrics in your table, so it remains useful to your reader.

Bonus: Shiny charts

Our list constrained us to five categories, but here’s one bonus for making it to the end:

Shiny charts.

What are shiny charts?

Shiny charts are visualizations that your audience loves and gets excited about, even if they’re not super actionable.

Your readers may not learn anything new, but they’ll feel like they learned something new.

Maps are a great example. 

Many dataviz experts say not to use map charts; there are better ways to communicate location data. 

But try to find a client or stakeholder who doesn’t love to see performance data on a map. Go ahead. I’ll wait.

Sure it’s a bit counterintuitive when you’re trying to build out an actionable dashboard. Maybe even a bit controversial. And you don’t have to do it. But a chart that makes your audience feel good just for seeing it has its own merit.

Putting it all together

While your Looker Studio dashboard can technically include whatever you want, it should at a minimum include:

  • Title and context
  • KPI scorecards
  • Goal pacing
  • Historical comparisons
  • Categorical tables

These don’t need to (and can’t) all be discrete sections. One scorecard can include a title, KPI, pacing, and time comparison.

There are many other charts and visualizations that can take your PPC dashboard from good to great. Getting started with this list will set you up for success and give you a dashboard worth the time it took to build.

The post 5 things your Google Looker Studio PPC Dashboard must have appeared first on Search Engine Land.

Friday, October 28, 2022

13 essential SEO skills you need to succeed

What is the greatest skill in SEO?

If you believe this tweet, it’s patience.

Although patience is a great answer, I would never say there is a “greatest” SEO skill.

Why?

Because SEO requires various hard skills (things you can learn or be taught) and soft skills (how you work and interact with others) to succeed.

As I’ve always found, asking many SEO professionals one question will get you a wide variety of opinions. So I asked several SEO professionals what they would consider the greatest skill in SEO.

Here’s what they told me.

1. Research and troubleshooting

Dave Davies, Lead SEO, Weights & Biases  

  • “As far as greatest skills go, I have to go with the stock SEO answer: It depends. In this case, though, it really does.

    If the practitioner is content-focused, then writing skills combined with strong research abilities (both SEO and subject-based) would definitely top my list. If the practitioner is a technical SEO, then the most important skills skew to technical knowledge – but even that branches out.

    If they work as a contractor, then they likely need to have a broad understanding of different tech and a strong ability to research specifics and work with developers. If they’re an in-house SEO or platform-specific contractor, then they would likely need a stronger grasp of a specific stack, and possibly deployment capabilities.

    The one skill that every SEO needs is research and troubleshooting capabilities. If they can’t do that, their career will be short. Thankfully, if you’re reading this, you do your research.”

2. Critical thinking

Dan Taylor, Head of Technical SEO, SALT.agency  

  • “For me, one of the greater skills for SEO professionals to develop is critical thinking. The SEO ecosystem is awash with noise and claims, with varying levels of data and anecdotal evidence to support them. Far too often, much of this content and advice is taken verbatim and applied to own situations without a second thought, with the expectation that the implementation will yield the same results.

    A common example in SEO happens when working with a client (and other stakeholders) on a website redesign. More often than not, the designs, and some proposed technical implementations, will be taken from other websites without considering the ‘other’ factors that go into how a website ranks. Just because eBay, the BBC, Amazon, etc. do X, doesn’t mean X will work for Bob’s Hardware or Bob’s Finance Co.

    With critical thinking, SEOs should read a study online, see the inputs and outcomes, and then intentionally find other studies that contradict these results – and then form their own opinions and influence their strategies with application to the current client scenario.”

John McAlpin, Director of SEO Strategy, Cardinal Digital Marketing

  • “Most experienced SEO professionals will likely say that the most important skill is a soft skill. I’d say that’s pretty spot-on.

    In my opinion, the most important SEO skill is critical thinking. Not everything in SEO is black and white. There’s always a middle ground. It takes real detective skills to find the answers.”

3. Problem-solving

Elmer Boutin, VP of Operations, WrightIMC

  • “Problem-solving and the ability to see the big picture. We often get so bogged down in the minutia of various SEO tasks that we miss the big picture of what’s going on between the website, the search engines, the potential visitors, and the business the website represents.

    Taking a step back and looking at things from end to end can help us solve problems more than fretting over things like meta description length or figuring out the optimum time to release a blog post.”

Corey Morris, Chief Strategy Officer, Voltage  

  • “The greatest skill in SEO is problem-solving. It rarely goes according to plan.

    Adapting, finding new ways, and exploring all resources for technical, on-page, IT, UX and off-page factors are all critical to success. Using a checklist, and checking boxes, won’t get you far.”

4. Experimenting

Himani Kankaria, Founder, Missive Digital

  • “The greatest skill in SEO is experimenting. One thing that works for one business or industry is not necessarily what would work for another. You cannot judge that without testing.

    Also, most of the projects have different audience buying and browsing perspectives, technologies for the website UX, content, how we build the navigation, etc. So you need to keep checking what works for your website, client, or employer because SEO is ever-evolving, and one cannot learn or unlearn new or outdated things without experimenting. Sometimes, we learn from someone else’s experiments, so experimenting is a super duper skill in SEO.”

5. Business acumen

Trond Lyngbø, Founder, Search Planet 

  • “As an SEO consultant specializing in enterprise-level SEO consulting for enterprise ecommerce companies and omnichannel retailers, I value SEO strategies that are not SEO strategies for Google, but SEO for business performance results, productivity and economic growth.

    A solid understanding of business, business processes, workflow automation and cross-functional alignment is worth gold in this segment.”

Connie Chen, SEO Specialist, Moving Traffic Media

  • “Commercial awareness (ROI) and soft skills because you need to be able to translate the work that you’re doing into measurable impact for your stakeholders. You must also know how to distill technical ideas into concepts that make sense to those stakeholders.”

6. Adaptability 

Maria White, Head of SEO, Kurt Geiger

  • “SEO is a spectrum. As such, it is hard to dominate all skills (Technical, Data, Content, PR, Story Telling, Management and more). These skills change as Google and search evolve. Every Google algorithm shapes the way we do SEO.

    Given that the only constant is change, then the best skill to succeed is the ability to adapt. Changes are the norm when working in an agency: new clients, bigger clients, various budgets, fast pace and more. If, along with that, we add constant changes to the algorithm that involves changing the way we work, then here is where only those with the ability to adapt will not only survive but they will thrive.”

Holly Miller Anderson, Lead SEO Product Manager, Under Armour

  • “The greatest skill in SEO is adaptability. SEO is a learned skill from technical to content. But adaptability is a choice.”

7. Communication

Casey Markee, Owner, Media Wyse

  • “SEO requires an ability to clearly explain concepts and objectives, usually in many different ways, to many different people. Your ability to do so, professionally and emotionlessly, is a big part of your daily success.”

8. Ability to learn

Chris Silver Smith, President, Argent Media

  • “The greatest skill in SEO is the ability to learn. One constant in SEO is change – one must learn new things and flex to adapt to new ways constantly.”

9. Persistence

Ludwig Makhyan, Co-founder, MAZELESS 

  • “Persistence is the greatest skill, where you don’t get disappointed by failures and continue to learn, push to improve, engage with other teams and cooperate for the greater success of the business you represent.” 

10. Cross-collaboration

Jon Clark, Managing Partner, Moving Traffic Media

  • “Cross-collaboration is an incredibly valuable skill. SEO is one of the few skills/roles that sits at the crossroads of nearly every department in an organization: design, development, content, UX, PR, marketing, engineering, quality assurance, analytics, and more.”

11. Understanding the user

Mike Grehan, SVP of Corporate Communications, NP Digital

  • “The greatest skill of an SEO is to fully understand the “information need” of an end user. They don’t start their research as a consumer or a B2C customer or B2B, or anything other than a human being trying to solve a problem. Help Google to help them – and Google will surely help you!”

12. Inquisitiveness

Mark Jackson, President and CEO, Vizion Interactive

  • “An inquisitive mind. In SEO, you can be a technically gifted and super bright person or a content marketing expert, but if you lack an inquisitive mind, you may not be looking at a project from all the possible angles.”

13. Ability to know what matters

Olaf Kopp, Co-founder and Head of SEO, Aufgesang

  • “The greatest skill in SEO is ‘ranking experience.’ You can do a lot in SEO, but only 20% of it ensures 80% of the success. Only enough experience to know which tasks are effective in which case makes the difference between good and average SEOs.”

Joe Devita, Managing Partner, Moving Traffic Media

  • “The ability to stay focused on the optimization signals that matter rather than be distracted by all the noise.”

The post 13 essential SEO skills you need to succeed appeared first on Search Engine Land.

4 digital marketing pain points SMBs face today by Microsoft Advertising

To succeed as a small or medium-sized business (SMB), employees must work smarter. Tight budgets and scrappy teams require innovation at every level — from the Founder and CMO, e-commerce Marketing Director to VP of Marketing, Social Media Director to Paid Search Strategist. This opportunity to bring creativity and agility to the table is one of the many reasons why employees find SMBs rewarding workplaces. Employees can help define the company vision. They can imagine ways to actualize this vision. And often, with SMBs, the product offering aligns with employee values and belief systems. 

But let’s face it. The job of a digital marketing decision-maker within an SMB can be challenging — from the long hours to shifting budget priorities. Some might say digital marketing for an SMB is just as hard as creating the company product, thanks to ever-changing platforms, resources, content demands, and time constraints. Getting seen by the right audience can be difficult.

A challenging digital marketing world

Tasked with a new campaign, the digital marketing lead faces questions on how to create, target and execute digital advertising: When do you launch a digital campaign? What platform do you advertise on? How do you reach your target audience? What do you even post?

Digital marketers understand online advertising is key to amplifying their brand, but often they’re not clear where to begin; SMB advertising starts to feel like a ball-and-chain. Rapid changes in the advertising industry also contribute to marketer overwhelm. Consumer behaviors and demands for privacy are forcing brands to adapt how they reach and best serve people’s needs. While the digital marketing space can feel overwhelming, now is a crucial time for marketers to forge deeper relationships with people.

You need to continually create and revamp the things that work, staying timely. It never ends. We have to keep tweaking our graphics to make sure they’re going to catch attention. It’s not like back in the day when you could run an ad in a newspaper and that same ad was going to run for six weeks, and you were done with it. You constantly have to make content to stay top-of-mind.

To help accelerate SMB growth, Microsoft developed a quantitative and qualitative research program to better understand how SMBs manage their digital marketing today and to identify the pain points digital marketing leads face. Advertising decision makers of companies with fewer than 200 employees participated in a 15-minute survey to understand their needs and top pain points within the digital advertising landscape. Microsoft later conducted qualitative interviews to dig deeper into SMB needs. Most respondents noted they had few internal resources to support their efforts and often relied on agencies or freelancers for support (typically for content production or digital campaign management and optimization).

Four universal digital marketing pain points

The SMB founders and employees surveyed had different marketing POVs and experienced challenges unique to their roles. Some respondents were big-picture thinkers struggling to keep up with the ever-changing digital landscape, while others valued flexibility, optimization and results but had a tough time justifying marketing dollars spent. Many focused outward, eager to explore new ways to become and stay relevant on emerging social platforms like TikTok, while other leaders spent a large amount of time inside marketing brainstorming sessions, developing strategy with their team.

Although the findings were as diverse (and interesting) as the businesses surveyed, four clear SMB challenges emerged as shared frustrations: content creation, time and resource constraints, platform fragmentation, and evaluating ROI/ROAS. Read on for details about shared SMB marketer pain points.

Content creation

A shared frustration with SMB respondents was the volume and type of content deliverables required per campaign — and the reality that content takes time and effort to create. Many people and review cycles are necessary to strategize, envision, develop, write, design, update and optimize a successful digital marketing campaign. This constant demand for content tests SMBs’ bandwidth to fully execute the planned marketing vision.

Coming up with the right content and being able to put that content together…that’s driving everything these days. Having the right content and current content. For me, it’s a resource thing, I don’t have the resources to generate content in a timely manner. Keeping it fresh.

Time and resource constraints

The list of tasks for digital advertising marketers is extensive — from determining the marketing budget to developing and distributing content, managing digital campaigns, to analyzing and optimizing marketing efforts…and everything in between. Marketers have much to do but little time or resources to get the work done. The study shows that, out of necessity, SMB digital marketers are forced to either become marketing “Jacks and Jills of all trades” or are left scrambling to find freelancers or agencies to get tasks accomplished. The result? Mixing varying resources produces varying results.

You spend a lot of time to do a quality check on content, because once it goes up there, it’s up there…Creating and monitoring content has been the thing that takes the most time because it’s only one piece of what I do here, and I’m the only person that’s doing it right now…it just takes time.

Platform fragmentation

The stress of platform fragmentation is another shared SMB hurdle, as marketers face large quantities of information to learn, manage, and analyze. This is because digital marketers use many campaign and reporting platforms with unique algorithms and ad formats. Plus, these platforms and tools are continuously evolving. Digital advertising leaders experience overwhelm, not only with the many CMS platforms, reporting tools and analytic insights available but also with feature updates within each tool. Forced to keep up with platforms and upgrades, marketers experience pressure to hire more people to diversify the learning load.

Algorithms change all the time. You have to be specialized in digital marketing to understand everything. My marketing department also works in shopper marketing and trade marketing. The amount of specialization that we can get to a degree is limited.

Evaluating ROI/ROAS

Evaluating a campaign’s return on investment (ROI) and return on ad spend (ROAS) has always been challenging — but respondents note that it is even harder to track conversions and gauge true campaign ROI/ROAS with recent privacy changes. Another shared challenge is the perceived lack of standardized metric consistency or transparency across platforms. When developing reports for executives, respondents are tasked with piecing together multiple reports from different platforms to paint a clear picture of a campaign’s tangible return on investment.

I think conversions are really frustrating. Some of that is the iOS update stuff that happened recently…it’s not accurate because it’s not giving conversions from Facebook. I want to have a completely accurate ROI for digital campaigns, and I don’t feel I’m getting that.

SMB digital advertising solutions

In today’s digital space, businesses need smarter solutions to grow their business online and find new customers. SMBs are time-constrained and know every click matters. That is why Microsoft Advertising offers its newly redesigned Smart Campaigns experience to make online advertising easier and help small to medium-sized businesses reach more customers across leading advertising and social media platforms.

Smart Campaigns empowers digital marketing leaders to easily reach high-value customers across the web who have higher buying power, spend more online, and are more likely to engage with ads. It’s easy to get started. Marketers set up ads in a matter of minutes while they watch in real-time as the platform intuitively improves the ad, measures its performance, and shows understandable results across platforms.

A new feature within Smart Campaigns is Multi-platform. With Multi-platform, SMBs can expand their reach and maximize their investment by using one ad tool to target many channels like Google Ads, Facebook, Twitter, Instagram, LinkedIn and Microsoft Advertising. Instead of creating ads on fragmented platforms to launch and monitor a campaign, SMBs can save time by running ads on multiple platforms in minutes.

Smart Campaigns with Multi-platform is a new digital marketing ecosystem designed to eliminate SMB digital marketing pain points and connect marketers with people at the right moments across work and life.   

Hearing that everything could be in one place and that you could manage it all on different platforms, that’s exciting and innovative…I would be very interested. Being able to log on and do everything and see everything in one spot would save me time. That would be amazing.

The post 4 digital marketing pain points SMBs face today appeared first on Search Engine Land.

Thursday, October 27, 2022

Google Ads podcast placement now available

Google advertisers can now advertise on podcasts.

Yep, announced today, “advertisers can now align their ads with podcast content globally. Simply create an audio or video campaign and select “Podcast” as a placement.”

Why we care. Last month we reported on three new audio, shopping, and streaming features available to YouTube advertisers. Today advertisers can officially select “podcast” as their preferred placement.

The post Google Ads podcast placement now available appeared first on Search Engine Land.

Licensed healthcare providers eligible to apply for new YouTube product features

YouTube has just announced that licensed healthcare providers can now apply to make their channels eligible for new health product features – a suite of information resources released last year.

What this means. The health product features previously launched include health source information panels to help viewers identify videos from authoritative sources and health content shelves that highlight videos from these sources when you search for health topics, so people can more easily navigate and evaluate health information online.

Previously, those features have only been available to educational institutions, public health departments, hospitals, and government entities. The new guidelines will make the features available to a wider group of healthcare providers.

How to apply. Eligible healthcare providers can apply starting today using the guidelines below, taken directly from the YouTube blog announcement.

Applicants must have proof of their license, follow best practices for health information sharing as set out by the Council of Medical Specialty Societies, the National Academy of Medicine and the World Health Organization, and have a channel in good standing on YouTube. Full details on eligibility requirements are here.

All channels that apply will be reviewed against these guidelines, and the license of the applying healthcare professional will be verified. In the coming months, eligible channels that have applied through this process will be given a health source information panel that identifies them as a licensed healthcare professional and their videos will appear in relevant search results in health content shelves. Health creators in the US can apply starting October 27th at health.youtube, and we’ll continue to expand availability to other markets and additional medical specialties in the future.

Why we care. YouTube is trying to help people become more informed, engaged and empowered about their health by attempting to create a space where they can find reliable, factual, and informative content from legitimate healthcare providers.

However, not every licensed healthcare provider shares safe, proven, harmless content. Users should still do their due diligence to ensure that the content they are consuming is high quality.

Additionally, advertisers who work with licensed providers should apply for the new features today to ensure their channels have added visibility.

The post Licensed healthcare providers eligible to apply for new YouTube product features appeared first on Search Engine Land.

Reddit is building an Ads API, first 4 partners announced

Reddit is working on a new Ads API and has just announced its first four alpha partners. The partners will be integrated into the API and are helping build solutions that will ultimately help advertisers build, scale, and optimize campaigns.

Who are the partners. The four partners involved in Reddit’s new venture are:

  • Vidmob
  • Sprinklr
  • adMixt
  • PMG

Who will benefit. Reddit’s API will benefit advertisers spending at scale, as well as self-serve advertisers who are using Reddit ads for the first time.

Release date. The API is still in development and there is no release date published at this time. Reddit says they are looking to “bring more strategic developers on board in the coming months.” 

What Reddit says. “We have long had the aspiration to build an ecosystem of partners via our API that enables more effective and efficient campaign management on our platform. The Reddit Ads API will allow a global, diverse set of partners and clients to access all the capabilities we have built and continue to develop to drive performance,” said Reddit COO, Jen Wong. “These foundational alpha partners represent some of the best and brightest across the industry in terms of innovation, creativity and adtech. Having them join our developer ecosystem is tremendously exciting.”

Why we care.

The post Reddit is building an Ads API, first 4 partners announced appeared first on Search Engine Land.

GA4 gets new homepage experience, 5 new features

Google has just unveiled several new Google Analytics 4 (GA4) updates, including a new homepage experience, real-time behavioral modeling reports, and custom channel grouping.

Insights through machine learning

Behavioral modeling. Behavior modeling with real-time reporting will give advertisers a complete picture of user behavior as it happens, in a privacy-centric way. 

Behavior modeling uses machine learning to fill in the gaps of your understanding of customer behavior when cookies and other identifiers aren’t available.

Real-time updates will be available in the near future to give advertisers a complete view of the customer journey as it’s happening.

The new homepage experience. Originally previewed at Google Marketing Live and available to all advertisers as of today, the new homepage is personalized for customers, highlighting key top-line trends, real-time behavior and their most viewed reports. Additionally, it uses machine learning to look for trends and insights and surfaces them directly to advertisers on the homepage.

Integrations and solutions to power better ROI

Data-driven attribution (DDA). DDA was introduced into GA4 earlier this year, after becoming the default for all ads conversions last fall. Soon, Google will launch custom channel grouping, a feature that lets advertisers combine different channels to compare cost-per-acquisition and return-on-ad-spend based on data-driven attribution. For example, businesses will be able to compare the performance of their paid search brand with their non-brand campaigns.

Integration with Campaign Manager 360. With this integration, you’ll be able to see a more complete picture of your campaign performance alongside web and app behavioral metrics. 

GA4 Setup Assistant updates

Universal Analytics (UA) is sunsetting in 2023, so to help advertisers complete the transition easily, Google is rolling out an update early next year that will automatically help standard UA users set up their GA4 properties.

Google’s step-by-step guide will help you migrate to GA4 on your own if you choose to opt out of using the Setup Assistant. If you choose to utilize the Setup Assistant, you can access it in the admin section of your UA property.

Beginning early next year, the Setup Assistant will create a new Google Analytics 4 property for each standard UA property that doesn’t already have one. It should also be noted that “the new GA4 properties will be connected with the corresponding UA properties to match your privacy and collection settings. They’ll also enable equivalent basic features such as goals and Google Ads links,” Google said.

New migration deadline for 360 customers

Additionally, Google acknowledged that this is a complex transition, especially for enterprise customers, which is why it is pushing the migration deadline for 360 customers from October 2023 to July 2024, to ensure a successful setup. 

Dig deeper. You can read the full announcement and more details on the features on the Google Marketing Blog.

Why we care. GA4 has been an inconvenient thorn in the garden of marketing. Maybe that’s being dramatic, but most advertisers are putting off implementing the new Analytics property to the very last minute. This is a bad idea.

If you haven’t set up GA4 yet, get started on it asap. Utilize the Setup Assistant to begin collecting data, then go back later and customize your dashboard and create additional views.

These new features are only helpful if you’re actually using the product.

More resources. Check out additional resources to help you set up and get the most out of GA4.

The post GA4 gets new homepage experience, 5 new features appeared first on Search Engine Land.

Crawl efficacy: How to level up crawl optimization

It’s not guaranteed that Googlebot will crawl every URL it can access on your site. On the contrary, the vast majority of sites have a significant chunk of pages that never get crawled.

The reality is, Google doesn’t have the resources to crawl every page it finds. All the URLs Googlebot has discovered, but has not yet crawled, along with URLs it intends to recrawl, are prioritized in a crawl queue.

This means Googlebot crawls only those that are assigned a high enough priority. And because the crawl queue is dynamic, it continuously changes as Google processes new URLs. And not all URLs join at the back of the queue.

So how do you ensure your site’s URLs are VIPs and jump the line?

Crawling is critically important for SEO

Content can't be curated by Google without being crawled.

In order for content to gain visibility, Googlebot has to crawl it first.

But the benefits are more nuanced than that because the faster a page is crawled from when it is:

  • Created, the sooner that new content can appear on Google. This is especially important for time-limited or first-to-market content strategies.
  • Updated, the sooner that refreshed content can start to impact rankings. This is especially important for both content republishing strategies and technical SEO tactics.

As such, crawling is essential for all your organic traffic. Yet too often it’s said crawl optimization is only beneficial for large websites.

 But it’s not about the size of your website, the frequency content is updated or whether you have “Discovered – currently not indexed” exclusions in Google Search Console. 

Crawl optimization is beneficial for every website. The misconception about its value seems to stem from meaningless measurements, especially crawl budget.

Crawl budget doesn’t matter

Crawl budget optimization to maximize the number of URLs crawled is misguided.

Too often, crawling is assessed based on crawl budget. This is the number of URLs Googlebot will crawl in a given amount of time on a particular website.

Google says it is determined by two factors:

  • Crawl rate limit (or what Googlebot can crawl): The speed at which Googlebot can fetch the website’s resources without impacting site performance. Essentially, a responsive server leads to a higher crawl rate.
  • Crawl demand (or what Googlebot wants to crawl): The number of URLs Googlebot visits during a single crawl based on the demand for (re)indexing, impacted by the popularity and staleness of the site’s content.

Once Googlebot “spends” its crawl budget, it stops crawling a site.

Google doesn’t provide a figure for crawl budget. The closest it comes is showing the total crawl requests in the Google Search Console crawl stats report.

So many SEOs, including myself in the past, have gone to great pains to try to infer crawl budget. 

The often presented steps are something along the lines of:

  • Determine how many crawlable pages you have on your site, often by looking at the number of URLs in your XML sitemap or running an unlimited crawl.
  • Calculate the average crawls per day by exporting the Google Search Console Crawl Stats report or based on Googlebot requests in log files.
  • Divide the number of pages by the average crawls per day. It’s often said that if the result is above 10, you should focus on crawl budget optimization.
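
For illustration only, that back-of-the-envelope heuristic boils down to the ratio below. The page and crawl counts are invented, and the following paragraphs explain why the number it produces is not a reliable guide.

```python
# The commonly repeated crawl-budget heuristic, shown for illustration only.
# The numbers are hypothetical; the article argues this ratio is a vanity metric.
crawlable_pages = 50_000      # e.g., URLs in the XML sitemap or found by a crawler
avg_crawls_per_day = 4_000    # e.g., from the GSC Crawl Stats export or log files

ratio = crawlable_pages / avg_crawls_per_day
print(f"Pages per daily crawl: {ratio:.1f}")  # 12.5

# The folk rule of thumb: a result above 10 supposedly signals a crawl budget problem.
if ratio > 10:
    print("Heuristic verdict: focus on crawl budget optimization")
```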

However, this process is problematic.

Not only because it assumes that every URL is crawled once, when in reality some are crawled multiple times, others not at all.

Not only because it assumes that one crawl equals one page, when in reality one page may require many URL crawls to fetch the resources (JS, CSS, etc.) required to load it. 

But most importantly, because when it is distilled down to a calculated metric such as average crawls per day, crawl budget is nothing but a vanity metric.

Any tactic aimed toward “crawl budget optimization” (a.k.a., aiming to continually increase the total amount of crawling) is a fool’s errand.

Why should you care about increasing the total number of crawls if it’s used on URLs of no value or pages that haven’t been changed since the last crawl? Such crawls won’t help SEO performance.

Plus, anyone who has ever looked at crawl statistics knows they fluctuate, often quite wildly, from one day to another depending on any number of factors. These fluctuations may or may not correlate with fast (re)indexing of SEO-relevant pages.

A rise or fall in the number of URLs crawled is neither inherently good nor bad. 

Crawl efficacy is an SEO KPI

Crawl efficacy optimization to minimize the time between URL (re)publication and crawling is actionable.

For the page(s) that you want to be indexed, the focus shouldn’t be on whether it was crawled but rather on how quickly it was crawled after being published or significantly changed.

Essentially, the goal is to minimize the time between an SEO-relevant page being created or updated and the next Googlebot crawl. I call this time delay the crawl efficacy.

The ideal way to measure crawl efficacy is to calculate the difference between the database create or update datetime and the next Googlebot crawl of the URL from the server log files.

If it’s challenging to get access to these data points, you could also use the XML sitemap lastmod date as a proxy and query URLs in the Google Search Console URL Inspection API for their last crawl status (up to a limit of 2,000 queries per day).
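
As a rough illustration of that proxy approach, here is a minimal Python sketch that compares a sitemap’s lastmod values against the last crawl time returned by the Search Console URL Inspection API. The sitemap URL, property URL and access token are placeholders, and authentication setup, error handling and the 2,000-query daily limit are left out for brevity.

```python
# Minimal sketch: approximate crawl efficacy as (last Googlebot crawl - sitemap lastmod).
# Placeholders: sitemap URL, GSC property and OAuth token (use google-auth in practice).
from datetime import datetime, timezone
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"   # placeholder
SITE_URL = "https://www.example.com/"                  # GSC property, placeholder
ACCESS_TOKEN = "ya29.placeholder"                      # placeholder OAuth 2.0 token

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_dt(value: str) -> datetime:
    """Parse ISO 8601 timestamps; assume UTC when no timezone is given."""
    dt = datetime.fromisoformat(value.replace("Z", "+00:00"))
    return dt if dt.tzinfo else dt.replace(tzinfo=timezone.utc)

def sitemap_entries(sitemap_url: str):
    """Yield (url, lastmod) pairs from a standard XML sitemap."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
    for node in root.findall("sm:url", NS):
        loc = node.findtext("sm:loc", namespaces=NS)
        lastmod = node.findtext("sm:lastmod", namespaces=NS)
        if loc and lastmod:
            yield loc.strip(), parse_dt(lastmod.strip())

def last_crawl_time(url: str):
    """Ask the URL Inspection API when Googlebot last crawled a URL."""
    resp = requests.post(
        "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"inspectionUrl": url, "siteUrl": SITE_URL},
        timeout=30,
    )
    resp.raise_for_status()
    crawled = resp.json()["inspectionResult"]["indexStatusResult"].get("lastCrawlTime")
    return parse_dt(crawled) if crawled else None

for url, lastmod in sitemap_entries(SITEMAP_URL):
    crawled = last_crawl_time(url)
    if crawled and crawled >= lastmod:
        print(f"{url}: crawl efficacy {crawled - lastmod}")
    else:
        print(f"{url}: not yet recrawled since lastmod")
```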

Plus, by using the URL Inspection API you can also track when the indexing status changes to calculate an indexing efficacy for newly created URLs, which is the difference between publication and successful indexing.

Because crawling that has no flow-on impact on indexing status, and doesn’t process a refresh of page content, is just a waste.

Crawl efficacy is an actionable metric: the lower it goes, the faster SEO-critical content can be surfaced to your audience across Google.

You can also use it to diagnose SEO issues. Drill down into URL patterns to understand how fast content from the various sections of your site is being crawled and whether this is what is holding back organic performance.

If you see that Googlebot is taking hours or days or weeks to crawl and thus index your newly created or recently updated content, what can you do about it?


7 steps to optimize crawling

Crawl optimization is all about guiding Googlebot to crawl important URLs fast when they are (re)published. Follow the seven steps below.

1. Ensure a fast, healthy server response

A highly performant server is critical. Googlebot will slow down or stop crawling when:

  • Crawling your site impacts performance. For example, the more it crawls, the slower the server response time becomes.
  • The server responds with a notable number of errors or connection timeouts.

On the flip side, improving page load speed, which allows more pages to be served, can lead to Googlebot crawling more URLs in the same amount of time. This is an additional benefit on top of page speed being a user experience and ranking factor.

If you don’t already, consider supporting HTTP/2, as it allows Googlebot to request more URLs with a similar load on your servers.

However, the correlation between performance and crawl volume is only up to a point. Once you cross that threshold, which varies from site to site, any additional gains in server performance are unlikely to correlate to an uptick in crawling.

How to check server health

In the Google Search Console crawl stats report, check that:

  • Host status: Shows green ticks.
  • 5xx errors: Constitute less than 1% of crawl responses.
  • Server response time chart: Is trending below 300 milliseconds.
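For a quick spot check outside of Search Console, a minimal sketch like the one below (Python standard library, hypothetical URL list) times a handful of requests and flags anything slow or erroring against the thresholds above. It measures round-trip time from wherever you run it, so treat it as a rough signal rather than a substitute for the report.

    import time
    import urllib.error
    import urllib.request

    # Hypothetical sample of URLs - replace with pages from your own site.
    SAMPLE_URLS = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets",
    ]

    def spot_check(urls, threshold_ms=300):
        for url in urls:
            start = time.perf_counter()
            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    status = response.status
            except urllib.error.HTTPError as error:
                status = error.code
            except urllib.error.URLError:
                status = None
            elapsed_ms = (time.perf_counter() - start) * 1000
            flag = "OK" if status and status < 500 and elapsed_ms < threshold_ms else "CHECK"
            print(f"{flag:5s} {status} {elapsed_ms:.0f}ms {url}")

    if __name__ == "__main__":
        spot_check(SAMPLE_URLS)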

2. Clean up low-value content

If a significant amount of site content is outdated, duplicate or low quality, it causes competition for crawl activity, potentially delaying the indexing of fresh content or reindexing of updated content.

Add to that the fact that regularly cleaning up low-value content also reduces index bloat and keyword cannibalization, and benefits user experience, and this is an SEO no-brainer.

Merge content with a 301 redirect when you have another page that can be seen as a clear replacement, understanding this will cost you a double crawl for processing, but it’s a worthwhile sacrifice for the link equity.

If there is no equivalent content, using a 301 will only result in a soft 404. Remove such content using a 410 (best) or 404 (close second) status code to give a strong signal not to crawl the URL again.

How to check for low-value content

The number of URLs in the Google Search Console pages report ‘crawled – currently not indexed’ exclusions. If this is high, review the samples provided for folder patterns or other issue indicators.
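If the count is high, one quick way to surface folder patterns is to export the report’s sample URLs and group them by path, as in the sketch below (the CSV file name and column header are assumptions based on a typical Search Console export).

    import csv
    from collections import Counter
    from urllib.parse import urlparse

    # Assumed export of the 'Crawled - currently not indexed' sample from Search Console.
    EXPORT_CSV = "crawled_not_indexed.csv"  # assumed column header: URL

    def top_folders(csv_path, depth=2, limit=20):
        """Count URLs by their first path segments to reveal problem sections."""
        counts = Counter()
        with open(csv_path, newline="") as handle:
            for row in csv.DictReader(handle):
                segments = urlparse(row["URL"]).path.strip("/").split("/")
                counts["/" + "/".join(segments[:depth])] += 1
        return counts.most_common(limit)

    if __name__ == "__main__":
        for folder, count in top_folders(EXPORT_CSV):
            print(f"{count:6d}  {folder}")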

3. Review indexing controls

Rel=canonical links are a strong hint to avoid indexing issues but are often over-relied on and end up causing crawl issues as every canonicalized URL costs at least two crawls, one for itself and one for its partner.

Similarly, noindex robots directives are useful for reducing index bloat, but a large number can negatively affect crawling – so use them only when necessary.

In both cases, ask yourself:

  • Are these indexing directives the optimal way to handle the SEO challenge? 
  • Can some URL routes be consolidated, removed or blocked in robots.txt?

If you are using it, seriously reconsider AMP as a long-term technical solution.

With the page experience update focusing on core web vitals and the inclusion of non-AMP pages in all Google experiences as long as you meet the site speed requirements, take a hard look at whether AMP is worth the double crawl.

How to check over-reliance on indexing controls

The number of URLs in the Google Search Console coverage report categorized under the exclusions without a clear reason:

  • Alternate page with proper canonical tag.
  • Excluded by noindex tag.
  • Duplicate, Google chose different canonical than the user.
  • Duplicate, submitted URL not selected as canonical.

4. Tell search engine spiders what to crawl and when

An essential tool to help Googlebot prioritize important site URLs and communicate when such pages are updated is an XML sitemap.

For effective crawler guidance, be sure to:

  • Only include URLs that are both indexable and valuable for SEO – generally, 200 status code, canonical, original content pages with an “index,follow” robots tag for which you care about their visibility in the SERPs.
  • Include accurate <lastmod> timestamp tags on the individual URLs and the sitemap itself as close to real-time as possible.

Google doesn’t check a sitemap every time a site is crawled. So whenever it’s updated, it’s best to bring it to Google’s attention by sending a GET request to Google’s sitemap ping endpoint from your browser or the command line.
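A minimal sketch of that ping, assuming a hypothetical sitemap location; the endpoint is Google’s documented sitemap ping URL, and the same request can be made by simply opening it in a browser.

    import urllib.parse
    import urllib.request

    # Hypothetical sitemap location - replace with your own.
    SITEMAP_URL = "https://www.example.com/sitemap.xml"

    def ping_google(sitemap_url):
        """Send a GET request to Google's sitemap ping endpoint."""
        ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
        with urllib.request.urlopen(ping, timeout=10) as response:
            return response.status  # 200 indicates the ping was received

    if __name__ == "__main__":
        print(ping_google(SITEMAP_URL))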

Additionally, specify the paths to the sitemap in the robots.txt file and submit it to Google Search Console using the sitemaps report.

As a rule, Google will crawl URLs in sitemaps more often than others. But even if a small percentage of URLs within your sitemap is low quality, it can dissuade Googlebot from using it for crawling suggestions.

XML sitemaps and links add URLs to the regular crawl queue. There is also a priority crawl queue, for which there are two entry methods.

Firstly, for those with job postings or live videos, you can submit URLs to Google's Indexing API.

Or if you want to catch the eye of Microsoft Bing or Yandex, you can use the IndexNow API for any URL. However, in my own testing, it had a limited impact on the crawling of URLs. So if you use IndexNow, be sure to monitor crawl efficacy for Bingbot.
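For reference, a bulk IndexNow submission is a single JSON POST to the shared endpoint. The host, key and URLs below are placeholders, and the payload shape should be double-checked against the current IndexNow documentation before use.

    import json
    import urllib.request

    # Placeholder values - the key must match the key file hosted on your own domain.
    PAYLOAD = {
        "host": "www.example.com",
        "key": "your-indexnow-key",
        "keyLocation": "https://www.example.com/your-indexnow-key.txt",
        "urlList": [
            "https://www.example.com/new-product",
            "https://www.example.com/updated-category",
        ],
    }

    def submit_indexnow(payload):
        """POST a batch of URLs to the IndexNow endpoint."""
        request = urllib.request.Request(
            "https://api.indexnow.org/indexnow",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json; charset=utf-8"},
            method="POST",
        )
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status  # 200 or 202 indicates the submission was accepted

    if __name__ == "__main__":
        print(submit_indexnow(PAYLOAD))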

Secondly, you can manually request indexing after inspecting the URL in Search Console. Keep in mind, though, that there is a daily quota of 10 URLs and crawling can still take many hours. It’s best to see this as a temporary patch while you dig to find the root cause of your crawling issue.

How to check for essential Googlebot do crawl guidance

In Google Search Console, your XML sitemap shows the status “Success” and was recently read.

5. Tell search engine spiders what not to crawl

Some pages may be important to users or site functionality, but you don’t want them to appear in search results. Prevent such URL routes from distracting crawlers with a robots.txt disallow. This could include:

  • APIs and CDNs. For example, if you are a customer of Cloudflare, be sure to disallow the folder /cdn-cgi/ which is added to your site.
  • Unimportant images, scripts or style files, if the pages loaded without these resources are not significantly affected by the loss.
  • Functional pages, such as a shopping cart.
  • Infinite spaces, such as those created by calendar pages.
  • Parameter pages. Especially those from faceted navigation that filter (e.g., ?price-range=20-50), reorder (e.g., ?sort=) or search (e.g., ?q=) as every single combination is counted by crawlers as a separate page.

Be mindful to not completely block the pagination parameter. Crawlable pagination up to a point is often essential for Googlebot to discover content and process internal link equity. (Check out this Semrush webinar on pagination to learn more details on the why.)
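To sanity check disallow rules like these before deploying them, a simplified matcher can be run against sample URLs, as in the sketch below; the patterns and paths are purely illustrative, and it ignores Allow rules and rule precedence.

    import re

    # Illustrative robots.txt disallow patterns - adapt the paths and parameters to your own site.
    DISALLOW_PATTERNS = [
        "/cdn-cgi/",
        "/cart",
        "/*?*sort=",
        "/*?*q=",
        "/*?*price-range=",
    ]

    TEST_PATHS = [
        "/category/widgets",
        "/category/widgets?page=2",      # pagination - should stay crawlable
        "/category/widgets?sort=price",  # should be blocked
        "/search?q=widgets",             # should be blocked
        "/cart",                         # should be blocked
    ]

    def to_regex(pattern):
        """Translate a robots.txt path pattern ('*' wildcard) into an anchored regular expression."""
        return re.compile("^" + ".*".join(re.escape(part) for part in pattern.split("*")))

    RULES = [to_regex(pattern) for pattern in DISALLOW_PATTERNS]

    for path in TEST_PATHS:
        blocked = any(rule.match(path) for rule in RULES)
        print(f"{'blocked' if blocked else 'allowed':7s} {path}")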

URL parameters for tracking

And when it comes to tracking, rather than using UTM tags powered by parameters (i.e., ‘?’), use anchors (i.e., ‘#’). They offer the same reporting benefits in Google Analytics without being crawlable.

How to check for Googlebot do not crawl guidance

Review the sample of ‘Indexed, not submitted in sitemap’ URLs in Google Search Console. Ignoring the first few pages of pagination, what other paths do you find? Should they be included in an XML sitemap, blocked from being crawled or let be?

Also, review the list of “Discovered - currently not indexed” – blocking in robots.txt any URL paths that offer low to no value to Google.

To take this to the next level, review all Googlebot smartphone crawls in the server log files for valueless paths.

6. Support relevant links

Backlinks to a page are valuable for many aspects of SEO, and crawling is no exception. But external links can be challenging to earn for certain page types – for example, deep pages such as products, categories on the lower levels of the site architecture or even articles.

On the other hand, relevant internal links are:

  • Technically scalable.
  • Powerful signals to Googlebot to prioritize a page for crawling.
  • Particularly impactful for deep page crawling.

Breadcrumbs, related content blocks, quick filters and use of well-curated tags are all of significant benefit to crawl efficacy. As they are SEO-critical content, ensure no such internal links are dependent on JavaScript but rather use a standard, crawlable <a> link.

Bear in mind that such internal links should also add actual value for the user.
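One quick way to spot JavaScript-dependent linking is to compare the anchors present in the raw HTML against what you see in the rendered page. The sketch below extracts only standard <a href> links from the unrendered source of a hypothetical URL; anything visible on the page but missing from this output is likely injected by JavaScript.

    from html.parser import HTMLParser
    import urllib.request

    # Hypothetical page to inspect - replace with one of your own category or article pages.
    PAGE_URL = "https://www.example.com/category/widgets"

    class AnchorCollector(HTMLParser):
        """Collect href values from standard <a> tags in the raw (unrendered) HTML."""

        def __init__(self):
            super().__init__()
            self.hrefs = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.hrefs.append(value)

    with urllib.request.urlopen(PAGE_URL, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")

    collector = AnchorCollector()
    collector.feed(html)

    # Links that only appear after JavaScript executes will be missing from this list.
    for href in collector.hrefs:
        print(href)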

How to check for relevant links

Run a manual crawl of your full site with a tool like Screaming Frog’s SEO Spider, looking for:

  • Orphan URLs.
  • Internal links blocked by robots.txt.
  • Internal links to any non-200 status code.
  • The percentage of internally linked non-indexable URLs.

7. Audit remaining crawling issues

If all of the above optimizations are complete and your crawl efficacy remains suboptimal, conduct a deep dive audit.

Start by reviewing the samples of any remaining Google Search Console exclusions to identify crawl issues.

Once those are addressed, go deeper by using a manual crawling tool to crawl all the pages in the site structure like Googlebot would. Cross-reference this against the log files narrowed down to Googlebot IPs to understand which of those pages are and aren’t being crawled.

Finally, launch into log file analysis narrowed down to Googlebot IP for at least four weeks of data, ideally more.

If you are not familiar with the format of log files, leverage a log analyzer tool. Ultimately, this is the best source to understand how Google crawls your site.
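When narrowing log files down to Googlebot, it is worth confirming the traffic really is Googlebot rather than a spoofed user agent. The widely documented check is a reverse DNS lookup followed by a forward confirmation, sketched here with an illustrative IP.

    import socket

    def is_genuine_googlebot(ip_address):
        """Verify a crawler IP via reverse DNS, then confirm with a forward lookup."""
        try:
            host, _, _ = socket.gethostbyaddr(ip_address)
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            return ip_address in socket.gethostbyname_ex(host)[2]
        except socket.gaierror:
            return False

    # Example IP taken from a log line - replace with addresses from your own log files.
    print(is_genuine_googlebot("66.249.66.1"))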

Once your audit is complete and you have a list of identified crawl issues, rank each issue by its expected level of effort and impact on performance.

Note: Other SEO experts have mentioned that clicks from the SERPs increase crawling of the landing page URL. However, I have not yet been able to confirm this with testing.

Prioritize crawl efficacy over crawl budget

The goal of crawling is not to get the highest amount of crawling, nor to have every page of a website crawled repeatedly; it is to entice a crawl of SEO-relevant content as close as possible to when a page is created or updated.

Overall, budgets don’t matter. It’s what you invest into that counts.

The post Crawl efficacy: How to level up crawl optimization appeared first on Search Engine Land.