🐹 Hamsterdam Part 130: Weekly SEO & AI News Recap (12/27, 2025 to 1/2, 2026)
By Ethan Lazuk
A weekly look-back at SEO & AI news, tips, and other content shared on social media & beyond.

Opening notes, thoughts, and musings:
- Welcome to another new week of Hamsterdam! 🐹
- I appreciate you being here, and look forward to sharing the week’s news! 🙌
- Happy new year to you and yours! 🎆
If you’re in a rush, hop down to the news portion. 📰
Or continue reading for two vocabulary lessons plus an introduction.

And we’re off …
🧑‍💻 Marketing word of the week: “Information processing”
In the context of marketing, information processing is how people notice, understand, remember, or use marketing messages to make decisions. In short, it refers to what happens in someone’s brain between them seeing a marketing message and deciding whether to care.
Information processing can be broken into five stages:
1. Exposure: the consumer comes into contact with a message.
2. Attention: the consumer notices the message.
3. Comprehension: they make sense of the message.
4. Memory: the message is stored (or not) in their memory.
5. Retrieval and decision: the consumer recalls the information when making a choice.
For us as SEOs, an example of information processing might be a user seeing a brand mentioned, or a page from a brand cited, in an AI Overview or assistant’s answer. Our hope is for the click and conversion, but more realistically we should hope the user notices the brand’s presence and retains it in memory for future actions.
🤖 AI word of the week: “Inductive bias”
Inductive bias is the set of assumptions an AI model uses to generalize beyond training data. Put simply, AI cannot learn anything without assumptions, so inductive bias is what it assumes before it sees the data.
Inductive bias exists because data is limited and noisy, and infinitely many explanations could fit the same data, so without inductive bias, a model wouldn’t know which patterns to prefer. This connects to the “no free lunch” idea in ML: no single model works best for all problems without assumptions.
Inductive bias guides a model to prefer simpler explanations over complex ones, assume similar inputs produce similar outputs, and focus on certain structures. Think of it as built-in preference.
Here’s a simple real-world example: imagine you show a model 5 photos of cats. A CNN assumes edges and shapes matter (it learns cat-like features). A linear model assumes flat relationships (it struggles). It’s the same data, but different inductive bias and therefore different results.
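The cat-photo comparison above can be sketched in a few lines of code. This is a hypothetical toy example (not from the post): two models see the same five training points, but their built-in assumptions lead them to different predictions.

```python
# Toy illustration of inductive bias: same data, different assumptions.
# Training data: a few (x, y) points sampled from a curved relationship.
train = [(0, 0), (1, 1), (2, 4), (3, 9), (4, 16)]

def linear_fit(points):
    """Least-squares line: assumes a flat (linear) relationship."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return lambda x: slope * x + intercept

def nearest_neighbor(points):
    """1-nearest-neighbor: assumes similar inputs give similar outputs."""
    return lambda x: min(points, key=lambda p: abs(p[0] - x))[1]

line = linear_fit(train)
nn = nearest_neighbor(train)

# Both models saw identical data, but their biases produce
# different predictions for the same new input:
print(line(3.5))  # the line extrapolates a trend
print(nn(3.5))    # 1-NN copies the closest training point's y
```

Neither model is “right” in general; each one’s inductive bias determines which patterns it can pick up, which is the same reason a CNN and a linear model behave so differently on the same cat photos.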
Why does all this matter? It explains why model choice matters, why LLMs behave differently from one another, and why inductive bias is critical for model interpretability.
😊 Introduction to week 130: “New Year’s Resolution”
What is your new year’s resolution? Mine is to learn more about AI.
Part of what drew me to SEO as a career was the opportunity to always be learning and improving. AI assistants and search features have changed the game, and that creates both angst and excitement.
While it’s hard to watch clicks decline even as content production ramps up (in some cases, anyway), it’s intriguing to consider how we can influence modern consumers’ buyer journeys through an expanded application of SEO fundamentals.
The more knowledge we have about the underlying systems and how they work, the more we can test and iterate on our strategies.
Cheers to a great 2025 and an even better 2026!
Thank you for supporting Hamsterdam and the cause of SEO & AI learning. 🙏
Enjoy the vibes:
Missed last week? Don’t worry, I got you! Read Part 129 to catch up.
🌟 Other great sources of weekly SEO news:
- The SEO Weekly – Garret Sussman, iPullRank
- SEOFOMO – Aleyda Solis
- Weekly Video Recaps – Barry Schwartz, SER
- Weekly SEO News YouTube channel – Olga Zarr, Seosly
- Niche Surfer – Yoyao Hsueh
Time for our weekly review of SEO social posts, articles, & more …

Now, let’s step inside the white flags of Hamsterdam …
⏩ Jump to a section:
- News, Google updates, & SERP tests
- SEO tips & tidbits
- Fundamentals & resources
- Articles, videos & case studies
- Technical SEO
- Content marketing
- Local SEO
- Data analysis & reporting
- AI, machine learning, & LLMs
- General marketing & miscellaneous
- Older stuff that’s good!
Or keep scrolling to see it all. ⏬
📰 SEO news, Google updates, SERP tests & notable posts
Notable updates or news related to Google Search or related SEO topics.
🍟 SEO tips & tidbits
Actionable tips, cool tidbits, and other snackable findings and observations that can be teaching moments.
📚 Fundamentals & resources
Essential information, concepts, or resources to learn about SEO or AI.
📹 Articles, videos & case studies
Longer-form content pieces shared on social, in newsletters, and elsewhere.
🧑‍💻 Technical SEO
Everything from basics to advanced moves (and also tools).
✍️ Content marketing
From what is helpful content to user journeys and beyond.
📍 Local SEO
From Google Business Profiles to reviews and more!
📊 Data analysis & reporting
Showing that what you’re doing is helping.
🤖 AI, machine learning, & LLMs
News related to models, papers, and companies.
🤔 General marketing & miscellaneous
This is for great content that isn’t necessarily SEO or marketing-specific. PPC, PR, dev, design, and social friends, check it out!
💎 Older stuff that’s good!
Not everything I find worth sharing is new as of this week, so these are older gems I came across that were published in the past.
Great job making it to the end. You rock! 🪨
Let’s connect!
Hit me up anytime via text or call at 813-557-9745 or on social or email:
Cheers! ✌️