Ethan Lazuk

SEO & marketing professional.


Hamsterdam Part 57: Weekly SEO & AI News Recap (5/6 to 5/12, 2024)

By Ethan Lazuk

Last updated:

A weekly look-back at SEO & AI news, tips, and other content shared on social media & beyond.

Hamsterdam Part 57 Weekly SEO News Recap from 5/6 to 5/12 with Google Search Liaison (Danny Sullivan) Quote
Source: Search Liaison

Opening notes:

Want these Hamsterdam recaps delivered? Subscribe to the free newsletter! (It’s pretty much a link to this article, but it’ll be conveniently emailed to you.) 😉

*Feel free to jump down to this week’s recap, or continue reading for “This week in SEO history,” an introduction, and a summary of the week’s SEO and AI news!


This week in SEO history: “user experience” is born (as a term)

The first known use of the term “user experience” happened during the CHI ’95 Conference Companion on Human Factors in Computing Systems, held May 7th-11th, 1995, in Denver, Colorado, according to the Web Design Museum.

This occurred in a presentation called “What You See, Some of What’s in the Future, And How We go About Doing It: HI at Apple Computer.”

That paper was submitted by Don Norman, Jim Miller, and Austin Henderson of Apple Computer, Inc. and is available from ResearchGate:

What You See, Some of What's in the Future, And How We go About Doing It: HI at Apple Computer

As we can see above, the abstract mentions, “the critical aspects of human interface research and application at Apple or, as we prefer to call it, the ‘User Experience.’”

Coincidentally, this week’s Hamsterdam History article looked at the topic of usability (now more commonly called UX).

Why does UX matter for SEO? Well, it applies to aspects of page experience, conversion rate optimization, and even information gain, as indicated by Andrew Holland’s SEL post (in this week’s Articles section below).

One of my favorite tools for understanding user behavior better is Microsoft Clarity, which can be used for a type of on-site anthropological research, seeing what real users experience.

Speaking of real …

Let’s get to our introduction this week, a commitment to authenticity.


Introduction to week 57: “authenticity vs. commercialism”

Scooby Doo ghost reveal.

I’ve written about the beef between Kendrick Lamar and Drake twice before in Hamsterdam introductions, including in Part 50 and last week.

The best article about it that I’ve seen was written from an anthropologist’s point-of-view — of course 😉 — in the New York Times. That can be found in the Miscellaneous section below.

There’s also a good Atlantic article.

But what I want to talk about is a line I saw from an MSNBC article, of all things:

“It’s the fight between authenticity and commercialism, between the most artfully made hip-hop and the hip-hop that flies off the shelves.”

– Charles F. Coleman Jr.

I don’t begrudge anyone for trying to make a name or sharing their passions.

But there is something to be said for going about it one way versus another.

About a year ago, I was working in agencies and feeling the need for exploration, to have an outlet for digging into what I felt mattered as opposed to what was in the interest of something else.

So I started to write for my own website, which I’d had for several years but never did much with.

Starting my own business in January then afforded me even more opportunity to create content, and I have a current goal of three posts per week (plus updates).

This also means I have Google Search Console data for a growing variety of topics, and I can see what’s popular (in terms of queries that get impressions).

I don’t choose blog topics that way, at least not anymore. But I often see a correlation between what’s popular in my GSC data and what gets shared most often on social media.

Why should I or anyone else care?

Well, I personally made a decision not to focus on those topics anymore, or even share them here. I see why they’re popular. I just don’t think they’re worthwhile.

The more I explore AI for Hamsterdam Research — and the more long-ass videos on deep learning I watch on YouTube, ha — the more I recognize an inherent silliness in certain topics. I also detect that viewpoint in the tone of others who know better.

The topics, while interesting, aren’t applicable to the goals of most site owners, nor do they represent what the world of search entails. It’s a sliver of a relic.

WHAT am I talking about?

I’m not looking to get specific.

All I’m looking to say is that one of the larger themes from the Kendrick-Drake battle — the dichotomy between authenticity and commercialism — plays out in all aspects of life and society.

To that end, I’ll continue to learn about and share the information that I think has real value and present it as authentically as I know how.

That’s my promise to you and, most of all, to myself. 🙂

Buckle up for a full week’s recap, and enjoy the vibes (a song chosen in honor of my mom, who once told me it’s one of her favorites):

Thank you for supporting Hamsterdam and the cause of SEO & AI learning.

Missed last week? Don’t worry, I got you! Read Part 56 to catch up.

Other great sources of weekly SEO news:


Now, time for our weekly review of SEO social posts, articles, & more …

The Big Lebowski is this your homework Larry scene.

Quick summary

  • OpenAI says it isn’t releasing a search engine tomorrow, but it might introduce a multimodal AI digital assistant
  • Screaming Frog’s v.20 update sparked a TON of cool possibilities (some listed in the technical SEO section); my favorite was Mike King’s use case (vector embeddings)
  • My secret article pick of the week is from DoorDash (building product knowledge graphs with LLMs), h/t: Marianne Sweeny on LinkedIn
  • Danny Sullivan had some interesting responses on X worth reviewing (see the News section below)
  • Several good articles this week, strewn about multiple sections below …
  • And more!

Jump to a section of this week’s recap:

Or keep scrolling to see it all.

Ok, time to step inside the white flags of Hamsterdam …

Hamsterdam scene from The Wire with Carver pointing at the white flags.

SEO news, Google updates, & SERP tests

Notable updates or news related to Google Search or related SEO topics.

Excerpt: “OpenAI has been showing some of its customers a new multimodal AI model that can both talk to you and recognize objects, according to a new report from The Information. Citing unnamed sources who’ve seen it, the outlet says this could be part of what the company plans to show on Monday.”

Here’s the updated SEL story.
I’d suggest going through this thread in full (so much cool stuff) and also reading Mike King’s post in the articles section below (my pick of the week!). There are more use cases in the technical SEO section, as well.

Excerpt: “Sullivan went on to say that sometimes Google changes the way a particular system works. This is what core updates are usually about: How are signals processed, what is the weighting, how do the individual areas interact? This is like updating the program code. And then there are the running systems. They process new data. This could lead to the results changing without the system itself being changed. The message to everyone who talks about never-ending updates is: The web is constantly evolving. It’s not a static environment.” (Translated.)

SEO tips & tidbits

Actionable tips, cool tidbits, and other findings and observations that can be teaching moments.

I know it’s way more complex, but I really see this idea of thresholds being updated as akin to backpropagation minimizing the loss function (error) during gradient descent in neural networks. (Btw, we included a nice intro guide to backprop in the AI section of last week’s Hamsterdam recap.)
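To ground the analogy, here’s a toy gradient-descent loop in Python. The one-parameter model and numbers are invented for illustration; this has nothing to do with how Google’s actual systems work:

```python
# Toy illustration: gradient descent nudging a single weight toward
# the value that minimizes a squared-error loss.

def loss(w, x, y):
    # Squared error for a one-parameter linear model: prediction = w * x
    return (w * x - y) ** 2

def grad(w, x, y):
    # Derivative of the loss with respect to w
    return 2 * x * (w * x - y)

w = 0.0          # initial weight (the "threshold" being updated)
x, y = 2.0, 6.0  # one training example; the ideal weight here is 3.0
lr = 0.05        # learning rate

for step in range(100):
    w -= lr * grad(w, x, y)  # step opposite the gradient

print(round(w, 3))  # converges to ~3.0
```

Each update moves the weight a little in the direction that reduces the error, which is the same intuition as thresholds being re-tuned over time.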

Site reputation abuse examples – Barry Adams

Barry Adams on LinkedIn with site reputation abuse examples.

Canonical date in GSC for de-indexed pages – Gary Illyes

Gary Illyes response about de-indexed page canonical dates on LinkedIn.

SEO (and AI) fundamentals & resources

Essential information, concepts, or resources to learn about SEO or AI.

How long does SEO really take? – Eli Schwartz, Eli’s Newsletter

How long does SEO really take newsletter.
Excerpt: “First, if something is broken on a website, such as a missing title tag or an orphaned page, it could take just 24 hours before the impact of the fix is noticed. This, of course, is contingent on the authority of the website. On a more popular website, the website will be seen by Google’s crawlers within hours, while on a smaller site, it could take significantly longer. However, in both cases, the fix will be implemented shortly after Google finds it.”

Articles, videos, case studies & more

Longer-form content pieces shared on social, in newsletters, and elsewhere.

Excerpt: “While a lot of SEOs are going to use this upgrade to turn SFSS into Scrapebox, one of the operations that comes with the tool is code to generate vector embeddings from OpenAI as you crawl. This is specifically what is going to help you make the upgrade from lexical analysis to semantic.”
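To make the jump from lexical to semantic concrete, here’s a minimal sketch of comparing pages by cosine similarity of their embedding vectors. The three-dimensional vectors are invented for illustration; real embeddings from an API like OpenAI’s have hundreds or thousands of dimensions:

```python
import math

# Toy embedding vectors for three crawled pages (invented numbers).
pages = {
    "page_a": [0.9, 0.1, 0.3],
    "page_b": [0.8, 0.2, 0.4],   # semantically close to page_a
    "page_c": [0.1, 0.9, 0.2],   # about something else entirely
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: 1.0 = same direction
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_ab = cosine_similarity(pages["page_a"], pages["page_b"])
sim_ac = cosine_similarity(pages["page_a"], pages["page_c"])
print(sim_ab > sim_ac)  # a and b sit closer together in vector space
```

In practice, you’d pair each crawled URL with its embedding and then cluster, dedupe, or internally link pages whose vectors are close.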

Agentic AI: When to Choose AI Agents vs GPTs for SEO and Why – Emilia Gjorgjevska, WordLift

Agentic AI When to Choose AI Agents vs GPTs for SEO and Why.
Excerpt: “Agentic AI emphasizes user experience, focusing on enhancing human-computer interactions to provide more intuitive, responsive, and personalized experiences. This aspect is crucial, aligning closely with user experience principles in web design and development that prioritize user-centric interfaces. When these principles are applied to SEO, known as Agentic SEO, they transform how websites are optimized for search engines.” (h/t: TechSEOTips Newsletter by Nikki Halliwell.) (Related: We covered Google Cloud Next’s AI agent demos in the introduction in Hamsterdam Part 53.)
Excerpt: “In March, the number of Person entities in the Knowledge Graph increased by 17%. By far, the biggest growth in new Person entities is people to whom Google is clearly able to apply full E-E-A-T credibility signals (researchers, writers, academics, journalists, etc.).”
Excerpt: “Schmidt was asked about the ‘blue link economy,’ and all the brands and businesses that have benefited from Google Search. But Google is changing. In the new world, AI will provide answers, not a list of websites for people to click on and find the answer for themselves.”
Excerpt: “If we’re going to optimize around information gain, we need to understand that it requires a greater understanding of two factors: Machine learning. Human learning. We already know that Google wants original, experience-based information from the best sources. They also want to reduce the cost of extracting that information. Yes, Google wants an easy life. So, how do we do this on a practical level? Simply put, we make extracting information easier for both machines and humans (at the same time), and here’s how.”

Excerpt: “Our early Google SGE research on July 1, 2023 revealed that the finance/investing vertical was only 22% covered. Our check in April 2024 showed 47% coverage of the same group of queries. We have been tracking Google SGE coverage since its inception and observing Google’s efforts to increase SGE coverage month after month. Based on these trends, my prediction is that we’ll see more Google SGE results as soon as Google gathers more data, allowing for a higher certainty of high-quality results with lower chances of AI hallucinations.”

Excerpt: “It doesn’t matter if you’ve got the latest skills or if you know the cutting edge approach to building on the web in 2024. I had neither when I started this, and still don’t a year later, though I certainly know a lot more than when I started. Instead: Have something to say and figure out the best possible way to say it. The rest will follow.”

What is Google Clamping Down On? Spring 2024 Updates – Tom Capper, Moz

What is Google Clamping Down On? Spring 2024 Updates - Tom Capper, Moz
Excerpt: “I think if I was considering rolling out AI-written content on my site right now, I would be very concerned about being caught in the crossfire here about Google not necessarily identifying particularly well what was and wasn’t helpful.” (Removed video to speed up page. Can view on YouTube.)

Local SEO

What’s happening in your local neck of the woods; well, actually in local search.

Technical SEO

Everything from basics to advanced moves (and also tools).

Excerpt: “Use Inspect to see what a change will look like before going live. This is especially helpful if you need to take a screenshot to get approval before or after you go live with a page edit or if you want to check how a change will look on desktop and mobile first.”

Content marketing

From what is helpful content to user journeys and beyond.

Evergreen-ish: Durable content for news SEO – Jessie Willms, WTF is SEO?

Evergreen-ish: Durable content for news SEO.
Excerpt: “With SGE, Google can draw from various sources and paraphrase content, potentially offering a more robust response to a reader’s query. On more basic evergreen queries, it’s possible SGE’s longer, bigger response will satisfy readers, negating the need to click on a variety of links. We can’t avoid this reality. But, the best defence against Google’s whims is a strong, loyal audience directly to your homepage and key section pages, while also staying engaged via push alerts, email and other efforts. Further, think about what your outlet can do that Google can’t, like 10x content (more on this later). Onsite evergreen is a chance for you to show off your value as a publication.”

How to Update Old Posts for SEO: Best Practices, 5 examples …and One Big Mistake – Andy Crestodina, Orbit Media Studios

How to Update Old Posts for SEO.
Excerpt: “You’ve published 100+ published articles over the years. Or you’ve taken over a website with a lot of legacy content. Some of these past articles already have 10x more value than the rest. Others are good but not yet great. Which ones should you update? The goal is to push more articles into that 10x results category. Some articles have the best opportunities to become big winners, with SEO and with visitors.”

Let’s talk Content Decay – Search Off the Record, Google Search Central

Content Decay Google Search Central.
Excerpt: “But that seemed to be like what, from my light research into this matter, it seems to be content decay in the SEO world seems to be about content that is declining in search interest. So it’s something that you would notice when you’re looking at Search Console that CTR has dropped off slowly over time, and people are not looking at your thing. And this is a signal that potentially your content has decayed over the last decade. I don’t know. The time period is unknown.” – Lizzi (Removed video to improve loading. Can view on YouTube.)

Data analysis & reporting

Showing that what you’re doing is helping.

AI, machine learning, & LLMs

A section dedicated to AI news, research, and articles.

New Microsoft AI model may challenge GPT-4 and Google Gemini – Benj Edwards, Ars Technica

New Microsoft AI model.
Excerpt: “The development of MAI-1 is being led by Mustafa Suleyman, the former Google AI leader who recently served as CEO of the AI startup Inflection before Microsoft acquired the majority of the startup’s staff and intellectual property for $650 million in March. Although MAI-1 may build on techniques brought over by former Inflection staff, it is reportedly an entirely new large language model (LLM), as confirmed by two Microsoft employees familiar with the project.”

The World Needs Something Better Than the Transformer – Mohit Pandey, Analytics India Magazine

The World Needs Something Better Than the Transformer.
Excerpt: “Regardless, researchers challenging Transformers is not new. The latest paper by Sepp Hochreiter, the inventor of LSTM, has unveiled a new LLM architecture featuring a significant innovation: xLSTM, which stands for Extended Long Short-Term Memory. The new architecture addresses a major weakness of previous LSTM designs, which were sequential in nature and unable to process all information at once.”

Why it matters: The xLSTM paper reviews the limitations of the original LSTM (long short-term memory) and explains how xLSTM achieves better performance through two main changes. First, exponential gating replaces the sigmoidal gates, providing a wider range of values for more fine-grained control of information flow and better learning of long-term dependencies (a core challenge for sequence-based tasks like language modeling). Second, modified memory structures adjust how new information is added and old information is forgotten, improving the network’s ability to retain information over longer sequences. That helps with tasks that require understanding context over extended periods, like tracking the meaning of a sentence or predicting future events. It’s a significant contribution to recurrent neural networks (RNNs) that could advance the field of deep learning and impact real-world applications.
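A rough way to see the gating difference is to compare the two activation functions directly. This is a simplified sketch, not the actual xLSTM equations (the paper pairs the exponential gate with a normalizer state to keep values stable, which I’ve omitted for clarity):

```python
import math

def sigmoid_gate(z):
    # Classic LSTM gate: output squashed into (0, 1), so it can only
    # attenuate or pass information through.
    return 1.0 / (1.0 + math.exp(-z))

def exponential_gate(z):
    # xLSTM-style exponential gate: unbounded above, so large
    # pre-activations can strongly amplify information rather than
    # merely pass it. (The paper adds a stabilizer to keep this finite.)
    return math.exp(z)

for z in (-2.0, 0.0, 2.0, 5.0):
    print(z, round(sigmoid_gate(z), 3), round(exponential_gate(z), 3))
# The sigmoid stays below 1.0 even for large z; the exponential keeps growing.
```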

Anthropic AI Launches a Prompt Engineering Tool that Generates Production-Ready Prompts in the Anthropic Console – Asif Razzaq and Nishant N., Mark Tech Post

Anthropic AI launches a prompt engineering tool.
Excerpt: “Prompt engineering has been gaining increasing attraction recently because people want to navigate AI more efficiently and get optimal outputs. But not everyone can be a prompt engineer or doesn’t have the time to learn it all; luckily for them, Anthropic, the creator company behind Claude large language model (LLM) and one of the biggest competitors of ChatGPT, has just announced a new prompt engineering tool that can turn your ideas into effective, precise and reliable prompts using Claude’s prompt engineering techniques.”

Why it matters: This prompt generator “can guide Claude to generate high-quality prompts tailored to your specific tasks.” It’s currently experimental.

Excerpt: “This survey paper addresses the absence of a comprehensive overview on Retrieval-Augmented Language Models (RALMs), both Retrieval-Augmented Generation (RAG) and Retrieval-Augmented Understanding (RAU), providing an in-depth examination of their paradigm, evolution, taxonomy, and applications. The paper discusses the essential components of RALMs, including Retrievers, Language Models, and Augmentations, and how their interactions lead to diverse model structures and applications.”

Why it matters: The full paper (PDF) goes through a ton of history and vocabulary that’s super helpful for learning about machine learning, and RAG in particular. It covers word embeddings and vector representations, which are important to understand for natural language processing (NLP) and semantic search, both relevant to SEO today.

“im-a-good-gpt2-chatbot” mystery – Perplexity thread

Perplexity thread on mystery chatbot.
Excerpt: “The emergence of ‘im-a-good-gpt2-chatbot’ and its counterpart ‘im-also-a-good-gpt2-chatbot’ has sparked widespread curiosity and speculation within the AI community. These mysterious AI chatbots reappeared on the LMSYS Org, a major large language model benchmarking site, displaying capabilities at or beyond the level of GPT-4, with some users asserting they even surpass the original models in performance. Their sudden appearance and the lack of clear information regarding their origins have led to a flurry of discussions and theories.”

Why it matters: These models are said to be comparable to, or in some ways better than, GPT-4, which underpins ChatGPT Plus and is rumored to use a mixture-of-experts (MoE) architecture, similar to Gemini 1.5. The mystery chatbots are speculated to come from OpenAI team members. It’s a wild story.
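For readers new to the term, here’s a toy sketch of what mixture-of-experts routing means: a gate scores the experts, and only the top-k highest-scoring experts run for a given input. The experts and scores here are invented; real MoE layers route per token inside a transformer with learned gating:

```python
import math

def softmax(scores):
    # Turn raw gate scores into weights that sum to 1
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Three toy "experts" (real ones are neural sub-networks)
experts = [
    lambda x: x + 1,
    lambda x: x * 2,
    lambda x: x - 3,
]

def moe_forward(x, gate_scores, top_k=2):
    weights = softmax(gate_scores)
    # Keep only the top-k experts: sparse activation = cheaper inference
    ranked = sorted(range(len(weights)), key=lambda i: weights[i], reverse=True)
    kept = {i: weights[i] for i in ranked[:top_k]}
    norm = sum(kept.values())  # renormalize the surviving weights
    return sum((w / norm) * experts[i](x) for i, w in kept.items())

out = moe_forward(4.0, gate_scores=[2.0, 1.0, -1.0])
print(round(out, 3))  # → 5.807
```

The payoff is that a model can have huge total capacity while only a fraction of its parameters fire on any one input.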

Excerpt: “Deep neural networks are highly expressive models that have recently achieved state of the art performance on speech and visual recognition tasks. While their expressiveness is the reason they succeed, it also causes them to learn uninterpretable solutions that could have counter-intuitive properties. In this paper we report two such properties.”

Why it matters: This paper (from 2014) touches on important concepts related to deep learning — the underpinning of today’s LLMs and certain search engine systems, including for ranking — such as interpretability, sparking a wave of further research in the field of AI. (Related: In the first Hamsterdam Research article on the embedding language model (ELM), we touched on interpretability in the context of embeddings but also deep learning models.)

Excerpt: “In Quantization in Depth you will build model quantization methods to shrink model weights to ¼ their original size, and apply methods to maintain the compressed model’s performance. Your ability to quantize your models can make them more accessible, and also faster at inference time.”

Why it matters: This course teaches practical skills for optimizing deep learning models (like LLMs). (Related, sorta: this week’s Hamsterdam Research article on TeraHAC has a top section that mentions quantization in reference to vector spaces for search engines, specifically Vertex AI.)
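As a back-of-the-envelope illustration of the “¼ their original size” idea, here’s a minimal absmax (symmetric) int8 quantization sketch. The weight values are invented, and the course covers more sophisticated methods than this:

```python
# Minimal absmax (symmetric) int8 quantization sketch. Float32 weights
# take 4 bytes each; int8 takes 1 byte, hence roughly 1/4 the size.

weights = [0.42, -1.30, 0.07, 2.15, -0.88]

# The scale maps the largest-magnitude weight onto 127 (the int8 max).
scale = max(abs(w) for w in weights) / 127

quantized = [round(w / scale) for w in weights]   # int8 values
dequantized = [q * scale for q in quantized]      # approximate originals

max_error = max(abs(w - d) for w, d in zip(weights, dequantized))
print(quantized)
print(round(max_error, 4))  # small reconstruction error
```

Every quantized value fits in a signed byte, and the reconstruction error is bounded by half the scale, which is the trade-off quantization makes for the 4x size reduction.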

Excerpt: “We will highlight here how we use LLMs to extract product attributes from unstructured SKU data, allowing us to build a high-quality retail catalog that delivers the best possible experience for users in all new verticals. In the following sections, we describe three projects in which we used LLMs to build ML products for attribute extraction.”

Why it matters: It’s a fascinating read that shows the practical applications of LLMs (creating embeddings) and RAG for empowering ML models that personalize shopping experiences and recommendations, but more generally could apply to search overall. (Also h/t: Marianne Sweeny: “SEO community (and IA, content): Follow how to build knowledge graphs to better understand how search engines decide ranking. Yes, really.”)

h/t: Marianne Sweeny: “Illuminating and so important now. To where, I would who…perhaps the IA community will come around to learning more on how this data is used and extend their history in this area to solving the problem. Oh…wait..that is Next IA.”

TikTok content

It’s a search engine, right?

@private_collector Jim Reekes, who worked as Apple’s sound designer in the 1980s, with a backstory on how and why the sound was chosen. #apple #applemusic #synth #vintagesynth #mac #computer #sounddesign #fyp ♬ original sound – Turbo
@betweenusgirlies Being on a social media team is not all that fun #relatable #corporate #9to5 #socialmediamanager #betweenusgirlies @casecorradinn @bran_flakezz ♬ original sound – Between Us Girlies
@zander_whitehurst Design better sidebar navigations #ux #ui #figma #design #webdesign ♬ original sound – Zander Whitehurst • UX/UI

Humor

Subjectively funny content.

General marketing & miscellaneous

This is for great content that isn’t necessarily SEO or marketing-specific. PPC, PR, dev, design, and social friends, check it out!

And the Winner Is: Kendrick Lamar. And Old-School Hip-Hop. – NYT (Written by an anthropologist!)

NYT article on Kendrick Lamar.
Excerpt: “Mr. Lamar’s victory signals a resurgence of lyrically rich rap — and a return to the roots of hip-hop culture — all while establishing a new template for relevance in an era when content can go viral instantly on social media and streaming platforms. If Drake, who has become the face of rap’s mainstream pop faction, has lost this battle, that setback is not his alone.” (Note: Sorry if this is pay-walled. It was free for me but locked on 2nd visit.)

Older stuff that’s good!

Not everything I find worth sharing is new as of this week, so these are gems I came across published in the past.

Excerpt: “Traditional methods, such as wrappers, suffer from limited adaptability and scalability when faced with a new website. On the other hand, generative agents empowered by large language models (LLMs) exhibit poor performance and reusability in open-world scenarios. In this work, we introduce a crawler generation task for vertical information web pages and the paradigm of combining LLMs with crawlers, which helps crawlers handle diverse and changing web environments more efficiently. We propose AutoCrawler, a two-stage framework that leverages the hierarchical structure of HTML for progressive understanding. Through top-down and step-back operations, AutoCrawler can learn from erroneous actions and continuously prune HTML for better action generation.”

Gradient descent, how neural networks learn

How machines learn gradient descent video.
(Removed video to speed up page. Can view on YouTube.)

Great job making it to the end. You rock!

Want help with your SEO strategy?

I’m an independent SEO consultant based in Orlando, Florida, focusing on custom audits and strategies for brands. Don’t hesitate to reach out, or visit my about page for more information about me.

Let’s connect!

Hit me up anytime via text or call at 813-557-9745 or on social or email:

Cheers!

Editorial history:

Created by Ethan Lazuk on:

Last updated:

Need a hand with a brand audit or marketing strategy?

I’m an independent brand strategist and marketing consultant. Learn about my services or contact me for more information!
