Hamsterdam Part 57: Weekly SEO & AI News Recap (5/6 to 5/12, 2024)
By Ethan Lazuk
Last updated:
A weekly look-back at SEO & AI news, tips, and other content shared on social media & beyond.

Opening notes:
- Welcome to another week of Hamsterdam!
- All weeks are created equal, but this one probably has the most good stuff in a while. 😉
- I haven’t had the Humor section for a few weeks; glad to see it back!
- In other Hamsterdam news …
- I wrote about the UX (usability) work/ideas of Jared Spool circa 2005 in Hamsterdam History.
- For Hamsterdam Research, we delved into TeraHAC, a graph clustering algorithm for large (trillion-edge!) datasets.
Want these Hamsterdam recaps delivered? Subscribe to the free newsletter! (It’s pretty much a link to this article, but it’ll be conveniently emailed to you.) 😉
Feel free to jump down to this week’s recap, or continue reading for “This week in SEO history,” an introduction, and a summary of the week’s SEO and AI news!
This week in SEO history: “user experience” is born (as a term)
The first known use of the term “user experience” happened during the CHI ’95 Conference Companion on Human Factors in Computing Systems, held May 7th-11th, 1995, in Denver, Colorado, according to the Web Design Museum.
This occurred in a presentation called “What You See, Some of What’s in the Future, and How We Go About Doing It: HI at Apple Computer.”
That paper was submitted by Don Norman, Jim Miller, and Austin Henderson of Apple Computer, Inc. and is available from ResearchGate:

As we can see above, the abstract mentions, “the critical aspects of human interface research and application at Apple or, as we prefer to call it, the ‘User Experience.’”
Coincidentally, this week’s Hamsterdam History article looked at the topic of usability (now more commonly called UX).
Why does UX matter for SEO? Well, it applies to aspects of page experience, conversion rate optimization, and even information gain, as indicated by Andrew Holland’s SEL post (in this week’s Articles section below).
One of my favorite tools for understanding user behavior better is Microsoft Clarity, which can be used for a type of on-site anthropological research, seeing what real users experience.
Speaking of real …
Let’s get to our introduction this week, a commitment to authenticity.
Introduction to week 57: “authenticity vs. commercialism”

I’ve written about the beef between Kendrick Lamar and Drake twice before in Hamsterdam introductions, including in Part 50 and last week.
The best article about it that I’ve seen was written from an anthropologist’s point of view — of course 😉 — in the New York Times. That can be found in the Miscellaneous section below.
There’s also a good Atlantic article.
But what I want to talk about is a line I saw from an MSNBC article, of all things:
“It’s the fight between authenticity and commercialism, between the most artfully made hip-hop and the hip-hop that flies off the shelves.”
– Charles F. Coleman Jr.
I don’t begrudge anyone for trying to make a name or sharing their passions.
But there is something to be said for going about it one way versus another.
About a year ago, I was working in agencies and feeling the need for exploration: an outlet for digging into what I felt mattered rather than what served someone else’s interests.
So I started to write for my own website, which I’d had for several years but never did much with.
Starting my own business in January then afforded me even more opportunity to create content, and I have a current goal of three posts per week (plus updates).
This also means I have Google Search Console data for a growing variety of topics, and I can see what’s popular (in terms of queries that get impressions).
I don’t choose blog topics that way, at least not anymore. But I often see a correlation between what’s popular in my GSC data and what gets shared most often on social media.
Why should I or anyone else care?
Well, I personally made a decision not to focus on those topics anymore, or even share them here. I see why they’re popular. I just don’t think they’re worthwhile.
The more I explore AI for Hamsterdam Research — and the more long-ass videos on deep learning I watch on YouTube, ha — the more I recognize an inherent silliness in certain topics. I also detect that viewpoint in the tone of others who know better.
The topics, while interesting, aren’t applicable to the goals of most site owners, nor do they represent what the world of search entails. It’s a sliver of a relic.
WHAT am I talking about?
I’m not looking to get specific.
All I’m looking to say is that one of the larger themes from the Kendrick-Drake battle — the dichotomy between authenticity and commercialism — plays out in all aspects of life and society.
To that end, I’ll continue to learn about and share the information that I think has real value and present it as authentically as I know how.
That’s my promise to you and, most of all, to myself. 🙂
Buckle up for a full week’s recap, and enjoy the vibes (a song chosen in honor of my mom, who once told me it’s one of her favorites):
Thank you for supporting Hamsterdam and the cause of SEO & AI learning.
Missed last week? Don’t worry, I got you! Read Part 56 to catch up.
Other great sources of weekly SEO news:
- The SEO Weekly – Garret Sussman, iPullRank
- SEOFOMO – Aleyda Solis
- Weekly Video Recaps – Barry Schwartz, SER
- Weekly SEO News YouTube channel – Olga Zarr, Seosly
- Niche Surfer – Yoyao Hsueh
Now, time for our weekly review of SEO social posts, articles, & more …

Quick summary
- OpenAI isn’t releasing a search engine tomorrow, they say, but it might introduce a multimodal AI digital assistant
- Screaming Frog’s v.20 update sparked a TON of cool possibilities (some listed in the technical SEO section); my favorite was Mike King’s use case (vector embeddings)
- My secret article pick of the week is from DoorDash (building product knowledge graphs with LLMs), h/t: Marianne Sweeny on LinkedIn
- Danny Sullivan had some interesting responses on X worth reviewing (see the News section below)
- Several good articles this week, strewn about multiple sections below …
- And more!
Jump to a section of this week’s recap:
- News, Google updates, & SERP tests
- SEO tips & tidbits
- Fundamentals & resources
- Articles, videos & case studies
- Local SEO
- Technical SEO
- Content marketing
- Data analysis & reporting
- AI, machine learning, & LLMs
- TikTok section
- Humor
- Miscellaneous & general posts
- Older stuff that’s good!
Or keep scrolling to see it all.
Ok, time to step inside the white flags of Hamsterdam …

SEO news, Google updates, & SERP tests
Notable updates or news related to Google Search or related SEO topics.
Excerpt: “OpenAI has been showing some of its customers a new multimodal AI model that can both talk to you and recognize objects, according to a new report from The Information. Citing unnamed sources who’ve seen it, the outlet says this could be part of what the company plans to show on Monday.”
Excerpt: “Sullivan went on to say that sometimes Google changes the way a particular system works. This is what core updates are usually about: How are signals processed, what is the weighting, how do the individual areas interact? This is like updating the program code. And then there are the running systems. They process new data. This could lead to the results changing without the system itself being changed. The message to everyone who talks about never-ending updates is: The web is constantly evolving. It’s not a static environment.” (Translated.)
SEO tips & tidbits
Actionable tips, cool tidbits, and other findings and observations that can be teaching moments.
Site reputation abuse examples – Barry Adams

Canonical date in GSC for de-indexed pages – Gary Illyes

SEO (and AI) fundamentals & resources
Essential information, concepts, or resources to learn about SEO or AI.
How long does SEO really take? – Eli Schwartz, Eli’s Newsletter

Articles, videos, case studies & more
Longer-form content pieces shared on social, in newsletters, and elsewhere.
Excerpt: “While a lot of SEOs are going to use this upgrade to turn SFSS into Scrapebox, one of the operations that comes with the tool is code to generate vector embeddings from OpenAI as you crawl. This is specifically what is going to help you make the upgrade from lexical analysis to semantic.”
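Mike King’s point about moving from lexical to semantic analysis boils down to comparing pages as vectors. Here’s a toy sketch of what you could do with per-URL embeddings after a crawl — the URLs and 4-dimensional vectors are made up for illustration (real OpenAI embeddings have 1,536+ dimensions), and this isn’t Screaming Frog’s actual output format:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors:
    # ~1.0 = pointing the same way (semantically similar), ~0 = unrelated.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings for three crawled URLs.
pages = {
    "/blue-widgets": np.array([0.9, 0.1, 0.0, 0.2]),
    "/widgets-blue": np.array([0.8, 0.2, 0.1, 0.3]),
    "/contact-us":   np.array([0.0, 0.1, 0.9, 0.1]),
}

base = pages["/blue-widgets"]
for url, vec in pages.items():
    print(f"{url}: {cosine_similarity(base, vec):.3f}")
```

Ranking crawled pages by similarity to a target page like this is one practical way to surface near-duplicate or cannibalizing content — something lexical (keyword-matching) analysis tends to miss.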
Agentic AI: When to Choose AI Agents vs GPTs for SEO and Why – Emilia Gjorgjevska, WordLift

Excerpt: “Our early Google SGE research on July 1, 2023 revealed that the finance/investing vertical was only 22% covered. Our check in April 2024 showed 47% coverage of the same group of queries. We have been tracking Google SGE coverage since its inception and observing Google’s efforts to increase SGE coverage month after month. Based on these trends, my prediction is that we’ll see more Google SGE results as soon as Google gathers more data, allowing for a higher certainty of high-quality results with lower chances of AI hallucinations.”
Excerpt: “It doesn’t matter if you’ve got the latest skills or if you know the cutting edge approach to building on the web in 2024. I had neither when I started this, and still don’t a year later, though I certainly know a lot more than when I started. Instead: Have something to say and figure out the best possible way to say it. The rest will follow.”
What is Google Clamping Down On? Spring 2024 Updates – Tom Capper, Moz

Local SEO
What’s happening in your local neck of the woods; well, actually in local search.
Technical SEO
Everything from basics to advanced moves (and also tools).
Content marketing
From what is helpful content to user journeys and beyond.
Evergreen-ish: Durable content for news SEO – Jessie Willms, WTF is SEO?

How to Update Old Posts for SEO: Best Practices, 5 examples …and One Big Mistake – Andy Crestodina, Orbit Media Studios

Let’s talk Content Decay – Search Off the Record, Google Search Central

Data analysis & reporting
Showing that what you’re doing is helping.
AI, machine learning, & LLMs
A section dedicated to AI news, research, and articles.
New Microsoft AI model may challenge GPT-4 and Google Gemini – Benj Edwards, Ars Technica

The World Needs Something Better Than the Transformer – Mohit Pandey, Analytics India Magazine

Why it matters: The xLSTM paper discusses the limitations of LSTM (long short-term memory) networks and explains how xLSTM achieves better performance through two changes. First, exponential gating replaces the sigmoidal gates with exponential ones, which provide a wider range of values for more fine-grained control of information flow, leading to better learning of long-term dependencies (a core challenge for sequence-based tasks like language modeling). Second, modified memory structures adjust the equations governing how new information is added and how old information is forgotten, improving the network’s ability to retain information over longer sequences. That benefits tasks requiring context over extended periods, like understanding the meaning of a sentence or predicting future events. This is a significant contribution to improving recurrent neural networks (RNNs), one that can advance the field of deep learning and impact real-world applications.
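To make the gating difference concrete, here’s a toy numerical sketch (not the paper’s actual implementation): a sigmoid gate can only scale information down into (0, 1), while an exponential gate is unbounded above and can amplify it, stabilized by subtracting a running maximum as the paper describes.

```python
import numpy as np

def sigmoid_gate(x):
    # Classic LSTM gate: squashes pre-activations into (0, 1),
    # so the gate can only attenuate information.
    return 1.0 / (1.0 + np.exp(-x))

def exponential_gate(x, stabilizer):
    # xLSTM-style gate: exp() is unbounded above, so the gate can
    # also amplify. Subtracting a running max (the stabilizer state)
    # keeps the values numerically stable.
    return np.exp(x - stabilizer)

pre_activations = np.array([-2.0, 0.0, 3.0])
m = np.max(pre_activations)  # stabilizer state

print(sigmoid_gate(pre_activations))         # all values stay inside (0, 1)
print(exponential_gate(pre_activations, m))  # wider dynamic range, max = 1.0
```

The wider dynamic range is what gives the gate “more fine-grained control of information flow” across long sequences.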
Anthropic AI Launches a Prompt Engineering Tool that Generates Production-Ready Prompts in the Anthropic Console – Asif Razzaq and Nishant N., MarkTechPost

Why it matters: This prompt generator “can guide Claude to generate high-quality prompts tailored to your specific tasks.” It’s currently experimental.
Why it matters: The full paper (PDF) goes through a ton of history and vocabulary that’s super helpful to understand for learning about machine learning and RAG, in particular. This says a lot about word embeddings and vector representations, which are helpful to understand for natural language processing (NLP) and semantic search, both relevant to SEO today.
“im-a-good-gpt2-chatbot” mystery – Perplexity thread

Why it matters: These models are said to be comparable to or better than GPT-4 in some ways. GPT-4 is the underpinning of ChatGPT Plus and is rumored to use a mixture-of-experts (MoE) architecture, similar to Gemini 1.5, and the mystery models are speculated to come from OpenAI team members. It’s a wild story.
Why it matters: This paper (from 2014) touches on important concepts related to deep learning — the underpinning of today’s LLMs and certain search engine systems, including for ranking — such as interpretability, sparking a wave of further research in the field of AI. (Related: In the first Hamsterdam Research article on the embedding language model (ELM), we touched on interpretability in the context of embeddings but also deep learning models.)
Why it matters: This course teaches practical skills for optimizing deep learning models (like LLMs). (Related, sorta: this week’s Hamsterdam Research article on TeraHAC has a top section that mentions quantization in reference to vector spaces for search engines, specifically Vertex AI.)
Excerpt: “We will highlight here how we use LLMs to extract product attributes from unstructured SKU data, allowing us to build a high-quality retail catalog that delivers the best possible experience for users in all new verticals. In the following sections, we describe three projects in which we used LLMs to build ML products for attribute extraction.”
Why it matters: It’s a fascinating read that shows the practical applications of LLMs (creating embeddings) and RAG for empowering ML models that personalize shopping experiences and recommendations, but more generally could apply to search overall. (Also h/t: Marianne Sweeny: “SEO community (and IA, content): Follow how to build knowledge graphs to better understand how search engines decide ranking. Yes, really.”)
h/t: Marianne Sweeny: “Illuminating and so important now. To where, I would who…perhaps the IA community will come around to learning more on how this data is used and extend their history in this area to solving the problem. Oh…wait..that is Next IA.”
TikTok content
It’s a search engine, right?
@private_collector Jim Reekes, who worked as Apple’s sound designer in the 1980s, with a backstory on how and why the sound was chosen. #apple #applemusic #synth #vintagesynth #mac #computer #sounddesign #fyp ♬ original sound – Turbo
@betweenusgirlies Being on a social media team is not all that fun #relatable #corporate #9to5 #socialmediamanager #betweenusgirlies @casecorradinn @bran_flakezz ♬ original sound – Between Us Girlies
@zander_whitehurst Design better sidebar navigations #ux #ui #figma #design #webdesign ♬ original sound – Zander Whitehurst • UX/UI
Humor
Subjectively funny content.
General marketing & miscellaneous
This is for great content that isn’t necessarily SEO or marketing-specific. PPC, PR, dev, design, and social friends, check it out!
And the Winner Is: Kendrick Lamar. And Old-School Hip-Hop. – NYT (Written by an anthropologist!)

Older stuff that’s good!
Not everything I find worth sharing is new as of this week, so these are gems I came across published in the past.
Excerpt: “Traditional methods, such as wrappers, suffer from limited adaptability and scalability when faced with a new website. On the other hand, generative agents empowered by large language models (LLMs) exhibit poor performance and reusability in open-world scenarios. In this work, we introduce a crawler generation task for vertical information web pages and the paradigm of combining LLMs with crawlers, which helps crawlers handle diverse and changing web environments more efficiently. We propose AutoCrawler, a two-stage framework that leverages the hierarchical structure of HTML for progressive understanding. Through top-down and step-back operations, AutoCrawler can learn from erroneous actions and continuously prune HTML for better action generation.”
Gradient descent, how neural networks learn
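The core idea this video explains — nudging parameters downhill against the gradient of a loss — fits in a few lines of code. A minimal 1-D sketch (the learning rate and step count here are arbitrary illustrative choices):

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    # Repeatedly step opposite the gradient to walk toward a minimum.
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # → 3.0
```

Neural networks do the same thing, just over millions of parameters at once, with the gradient computed by backpropagation.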

Great job making it to the end. You rock!
Want help with your SEO strategy?
I’m an independent SEO consultant based in Orlando, Florida, focusing on custom audits and strategies for brands. Don’t hesitate to reach out, or visit my about page for more information about me.
Let’s connect!
Hit me up anytime via text or call at 813-557-9745 or on social or email:
Cheers!
Leave a Reply