🐹 Hamsterdam Part 66: Weekly SEO & AI News Recap (7/8 to 7/14, 2024)
By Ethan Lazuk
A weekly look-back at SEO & AI news, tips, and other content shared on social media & beyond.

Opening notes, thoughts, and musings:
- Welcome to a new week of Hamsterdam! 🐹
- I appreciate you being here, and look forward to sharing the week’s news! 🙌
- Firstly, Eminem’s new album is a mix of new and old-school madness. My wife was even converted back into a fan. Just … be careful. 💿 🎧 😳 💀 🔥
- I’m going to try and be briefer this week. Challenge accepted. 🤝 (Update: Failed. 😂)
- Fancy a longer read? Try my other blog content from this week:
- 🤓 RAG basics with useful context for SEO, based on a talk by Jo Kristian Bergum
- Bonus: Check out my TikTok finds of the week (a little bit of everything) 💎
- Side note: I’m having trouble getting content indexed atm; TBD on seeing this in Search 🔍 (Update: We’re indexed! Just search “#1 SEO agency.” JK. 😆 ✌️)
We also deliver! — Subscribe to the Hamsterdam newsletter. Hot and ready in 30 minutes or less 🍕, or sent out every Sunday. (One of those two.)
If you’re in a rush, hop down to the news portion. 📰
Or continue reading for two vocabulary lessons, this week in SEO history, plus an introduction. (Don’t forget to click the dropdowns.)

And we’re off …
✨ Marketing word of the week: “Influencer marketing”
Influencer marketing is a collaboration between a brand and an individual who, based on perceived expertise or authenticity within a niche, can influence the buying decisions of their following (ideally a following that aligns with the brand’s target audience).
Back when I was agency side, I got my first taste of influencer marketing. I found it to be a funny process.
Perhaps what’s most funny about it is how well it can work. 🥳
I see big brands using influencers on TikTok. The beauty and health spaces are obvious ones: “Come get ready with me while I put [endorsed] shampoo in my hair.” 💆 Or rather than a Super Bowl ad, they’ll have someone in pajamas eating their pizza, filmed on an iPhone. 🍕 That said, tech companies use influencers, too.
In my opinion, some influencer marketing happens under the table. 🫣 The FTC has guidelines for disclosing sponsored content on TikTok, for example, but all you have to do is scroll through the comments of videos to see people questioning, “Is this an ad?” 🤨
There’s also a reverse scenario, where influencers get the attention of brands and get included in their promotions, like how Google featured a comedic DJ known from TikTok as the opening act at I/O this year, likely to appeal to his audience. (Content like this is what set him off. 👈)
On the flip side, social users are savvy, imo, and they can grow tired of being sold to. I’ve seen celebrity Instagram accounts lose credibility for hawking too many products: “Their feed is just them endorsing stuff.”
That said, influencers can be straight up or even have fun with it, like Cardi B often does. (Check the cabinet.) 🎃 ☕️ 💰
The word “influencer” is kind of touchy, because we might think of people chasing clout, like by consuming laundry detergent, literally.
In reality, though, an influencer is someone with compelling expertise or experience in a knowledge domain. (So, I guess they E-E-A-T the Tide Pods. JK. 😂)
More brands today work with nano- (44%) or micro-influencers (26%) than macro ones (17%) or celebrities (13%). (Source. Also in the dropdown.)
Thanks to Discover, I also just came across this Influencity video on types of influencers that summarizes a lot of what I said but way more eloquently 😆 🙌:
🚨 ⬇️ 👈 Click here to read about possible SEO benefits and check out a few statistics about influencer marketing (if you want 😎).
Influencer campaigns can have SEO benefits, too. 🤗
Influencers’ content, if perceived as organic, can surface in search results (there’s a reason Google now refers to webmasters as “creators”). It can also drive brand awareness, which itself can lead to more navigational queries or higher trust (and then CTR) (and then probably boosted rankings), and even links or notable mentions. 🙌
I witnessed these benefits while working on cross-channel ecommerce marketing campaigns. That said, two difficulties that I saw were getting the company to send the product to the influencer on time and getting the influencer to shoot a video of a usable quality (or at all). 🤲
Serendipitously, Influencer Marketing Hub updated their 2024 benchmark report this week. It’s a 48-minute read, so you know I like it. 😆
Here are some highlights I found notable. 👀
📈 “63% plan to use AI in executing their influencer campaigns, 55% of these brands will use AI for influencer identification.”
That’s smart. Finding the right person to represent your brand is not easy. It’s not about the size of their following, but how well their followers align with your target audience, and how well the person fits your brand.
🧚 “There is a strong preference for working with small (nano – 44% and micro – 26%) influencers ahead of expensive macro-influencers (17%) and celebrities (13%).”
That supports the previous point about the right fit. There are plenty of ambitious people making cool content on social that aligns nicely with niche products or services.
💰 “60% of those respondents who budget for influencer marketing intend to increase their influencer marketing budget over 2024.”
That sounds impressive, but it’s not isolated. For instance, SEL featured a sponsored report from Ignite Visibility last month that says “82.5% of marketers plan to increase their SEO spending in 2024.”
Personally, I’d build a brand with SEO to then create evangelists who are organic influencers. Why pay for it when you can earn it for free through hard work? (JK on that last part. 😅 Creating compelling social videos is hard work. My last TikTok video has 1 view so far … 🤷)
📢 “The main purpose of running influencer campaigns is to create User Generated Content (56%). Generating sales (23%) is a distant second.”
This I love. It’s thinking that can apply broadly, too. SEO drives sales aplenty, for example, but it’s not all about lower-funnel strategies. In my experience, patient brands who build up their informational assets reap long-term rewards.
Or as Google’s former CEO Eric Schmidt says in a video in this week’s TikTok gems page, businesses should think (and act) 5 years ahead.
🧑‍🏫 🤖 ✅ AI word of the week: “In-context learning”
As luck would have it, this week is “I” words, and I recently learned about in-context learning (ICL) and wrote about it in a new article on RAG basics. Let’s elaborate on it here. 🙌
In-context learning refers to a large language model’s ability to learn and adapt during inference (while it’s being used): you provide examples directly within the prompt that show the model how to respond or fulfill a task. ICL differs from explicitly training a model for a task.
Google’s ML glossary says ICL is a synonym of few-shot learning, but upon further research, I believe there’s a nuanced difference: few-shot learning is a technique within ICL where the prompt includes a small number of examples. Using only a “few” examples is likely due to the constraints of context window sizes (input tokens) or for better precision.
Why is ICL necessary? Well, even though LLMs are trained on large datasets, they can’t possibly learn every specific task or nuance. Few-shot learning allows a model to adapt quickly to new situations for greater versatility.
Here’s an example prompt for ICL from Google’s ML glossary, which “contains two examples showing a large language model how to answer a query”:

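To make that concrete, here’s a minimal sketch of what a two-example ICL prompt looks like (my own toy task, not the glossary’s exact example):

```python
# A minimal two-example (few-shot) ICL prompt, assembled as a plain string.
# The task and examples are my own illustration, not Google's glossary text.
prompt = """Translate English to French.

English: The house is blue.
French: La maison est bleue.

English: The cat is small.
French: Le chat est petit.

English: The book is on the table.
French:"""

# The two labeled examples "teach" the model the task at inference time;
# no model parameters are updated -- that's the whole point of ICL.
print(prompt)
```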
Jo Kristian Bergum (whose talk is featured in the RAG basics article I linked earlier) published a new blog post this week for Vespa.ai about “adaptive in-context learning” (there’s also an excerpt in the AI section later), where he explains:
“In-context learning can be confusing for those with a Machine Learning (ML) background, where learning involves updating model parameters, also known as fine-tuning. In ICL, the model parameters remain unchanged. Instead, labeled examples are added to the instruction prompt during inference.”
– “Adaptive In-Context Learning 🤝 Vespa – part one,” Jo Bergum (Vespa.ai)
His article included example prompts, like the following one “with two examples from the training set”:

This illustrates prompt-based learning (or the technique of crafting specific prompts for few-shot learning ✍️), per Google’s ML glossary.
While ICL can make LLMs more adaptable and doesn’t require fine-tuning expertise, performance can also vary depending on the complexity of the task, quality of the examples, and capabilities of the model.
🚨 ⬇️ 👈 Click to continue reading about how ICL works on a more technical level (if you want 😎).
The foundation of ICL lies in the transformer architecture that powers most of today’s LLMs, specifically the mechanisms of self-attention and positional encoding.

Self-attention (red parts above) allows the model to weigh the importance of different words in the input (prompt) based on their context when generating the output (response). This enables the model to focus on the relevant parts of the examples provided during ICL.
Positional encoding helps the model understand the order of words, which is important for interpreting the structure and meaning of the provided examples.
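To make those two mechanisms concrete, here’s a toy NumPy sketch (random weights standing in for learned ones) of scaled dot-product self-attention plus sinusoidal positional encoding:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy setup: 4 tokens with 8-dimensional embeddings (random stand-ins).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))

# Learned projections (random here) map tokens to queries, keys, and values.
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv

# Self-attention: each token scores every token (including itself), then
# mixes their values by those weights -- this is how the model "focuses"
# on the relevant parts of the examples you provide in the prompt.
weights = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
output = weights @ V  # shape (4, 8)

# Sinusoidal positional encoding, added to embeddings so word order matters.
pos, i = np.arange(4)[:, None], np.arange(0, 8, 2)[None, :]
pe = np.zeros((4, 8))
pe[:, 0::2] = np.sin(pos / 10000 ** (i / 8))
pe[:, 1::2] = np.cos(pos / 10000 ** (i / 8))
```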
Using its broad understanding of language patterns, syntax, and semantics (and even reasoning abilities) from the pre-training it received on massive datasets, the LLM can then recognize and generalize patterns from the ICL examples.
As Gemini Advanced also pointed out, ICL is still an active area of research, but there’s one hypothesis that says LLMs implicitly learn a wide range of tasks during pre-training, so presenting the model with examples (ICL) allows it to activate the relevant “subnetworks” within its vast parameter space to perform the specific task.
But, as mentioned, how your prompt is crafted plays an important role in guiding the model’s behavior.
Other potential drawbacks with ICL are inconsistency (especially on complex tasks or ones with limited datasets), bias (which can be inherited from the training data and manifest in ICL behavior), and a lack of explainability (or not knowing why a model predicted what it did).
Still, ICL is a promising area that has the potential to revolutionize how we interact with LLMs, especially given larger context windows that allow for more numerous or complex examples.
This week in SEO history: AltaVista retires (2013)
AltaVista was a pioneering search engine created by researchers at DEC (Digital Equipment Corporation) that launched publicly on December 15th, 1995, with an initial index of around 16 million webpages.

AltaVista’s multi-threaded crawler named Scooter could perform multiple tasks simultaneously, making it much faster at crawling and indexing webpages than other crawlers of the era. 🌪️
📝 We mentioned transformers in the last section. Standard transformer architectures use multi-headed attention, where each head simultaneously focuses on a different aspect of the input sequence and the heads’ outputs are then combined. So parallel processing applies in both contexts: multiple workers (threads or attention heads) tackle pieces of a problem at once and aggregate the results, making the overall approach faster and more holistic.
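Not Scooter’s actual code, of course, just a toy Python sketch of the multi-threaded crawling idea (the URLs and worker count are placeholders):

```python
import concurrent.futures
import requests

def fetch(url):
    """Download one page; return (url, html) or (url, None) on failure."""
    try:
        return url, requests.get(url, timeout=10).text
    except requests.RequestException:
        return url, None

urls = ["https://example.com/", "https://example.org/"]  # placeholder frontier

# Several worker threads fetch pages concurrently, so one slow server
# doesn't stall the whole crawl -- the same idea behind Scooter's speed.
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    for url, html in pool.map(fetch, urls):
        print(url, "ok" if html else "failed")
```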
Combined with its powerful 64-bit Alpha processors (explained more in the dropdown), AltaVista was considerably fast. By 1996, it was handling 19 million requests per day (check out the old-school source Perplexity found for that stat), making it one of the most popular search engines.
However, the rise of Google in the early 2000s gradually diminished the popularity of AltaVista (Google Trends chart in the dropdown).
During this week in 2013 (July 8th), the search engine was shut down. 🫡
You can still find AltaVista.com in search results, but it 301 redirects to Yahoo!:

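You can verify the redirect chain yourself; here’s a quick sketch using Python’s requests library:

```python
import requests

# Follow altavista.com and print each hop in the redirect chain.
resp = requests.get("http://altavista.com", allow_redirects=True, timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url)  # the 301 hop(s)
print("Final destination:", resp.url)  # lands on a Yahoo! property
```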
It still gets a respectable number of organic hits, though, per Ahrefs:

🚨 ⬇️ 👈 Click to learn more about AltaVista’s history (with an interesting and speculative Google Trends finding) and how it was a pioneer in the search space (if you want 😎).
AltaVista changed ownership a few times. 🤝
In 1998, it was acquired (along with DEC) by Compaq (the first brand of desktop computer my family bought, at a Costco in Helena, Montana). In 1999, 83% of AltaVista was acquired by CMGI. Then in 2003, Overture purchased the search engine for $140 million. Later that year (2003), Yahoo! acquired Overture (and AltaVista).
With Google’s rise in the early 2000s, AltaVista’s use waned (but Virginia seemed to really like it, for some reason … ⬟ 🕵️ 🤷):

AltaVista was a search pioneer in several ways. 🤠
It was the first full-text database of the web. Unlike earlier search tools that referenced specific directories or categories (information like titles and summaries), AltaVista indexed the complete text of webpages (and organized them in a structured database), enabling users to search for info buried deep within pages. That was a major breakthrough that made it easier to explore the web.
It also had a simple and clean interface, with a single search bar, making it more intuitive, especially for the era.
Additionally, AltaVista used 64-bit Alpha processors (a DEC creation) that were cutting-edge at the time and gave the search engine a significant speed advantage for processing queries and larger volumes of searches, making it more scalable.
📝 Recall that in Hamsterdam Part 64’s history section, we featured the N64 game console, released in 1996 and so named for its 64-bit CPU and GPU (though it mostly ran on 32-bit data — blame marketing). GPUs also power many of today’s machine learning models, but the architecture is significantly more powerful. Even still, the concept of using specialized hardware to accelerate complex computations is central to the advancements demonstrated by AltaVista, N64, and today’s ML models.
Speaking of ML today, AltaVista was also the first search engine to allow natural language queries.
It also introduced multilingual searches and was among the first to launch image, video, and audio search capabilities.
Truth be told, I never used AltaVista (that I can recall), but looking back, we can see how its innovations in the ’90s contributed to how search would later evolve. 🤗
🤙 Introduction to week 66: Understanding
I had two experiences this week that impacted me.
The first was on Thursday night, when I watched a 2016 comedy documentary called “Jeff Ross Roasts Cops.” 🎤 👮
Now, these recaps are called “Hamsterdam” partly as an homage to The Wire.
What I love about The Wire are the representations: police, gangsters, politicians, teachers, elites, blue collar workers, people on the street, families of all styles, and on and on.
What they all have in common is their humanity and surroundings.
One of my favorite lines from the series (S4.E9) is:
“Wherever you go, there you are.”
I knew there was a reason I identified with it, and I just realized what it is.
Last week, I wrote a post about on-page SEO that referenced mindfulness meditation (which I practice).
It also referenced Jon Kabat-Zinn, an influencer (😉) in the space, and someone whose work the person who introduced me to mindfulness (a therapist and former pro golfer 🏌️) respected greatly.
As it turns out, JKZ wrote a book of the same title as that quote from The Wire:

The overarching theme, as I’d explain it, is that, while our circumstances may change, we still have to confront ourselves (thoughts and feelings), omnipresent aspects of human nature, and other things that’ll always be beyond our control.
Early in that Jeff Ross comedy documentary, for example, he performs for a squad room full of officers, and they’re stern faced. No laughs.
The reason, aside from his bad jokes (JK 😆), is that they’d seen a photo of him at a BLM protest just before he’d arrived, and they felt a certain type of way.
He broke through slowly and later did a great show at a police benefit, because Jeff Ross is a goat of a human 🐐, but it made me think about how, regardless of our intentions, not everyone is going to get something at first, or maybe at all.
Which leads to the second experience the following day (Friday), when I was walking and listening to Eminem’s new album, The Death of Slim Shady (Coup De Grâce).
Two minutes in, and I felt reminiscences of being back in elementary school, bumping The Slim Shady LP from a bootleg tape made on a friend’s boombox 📻 (because my parents wouldn’t buy it 😂).
To give you an idea, there’s a “Breaking News” skit on the album with a line that goes, “Eminem has released an album in which he is actually trying to cancel himself.” (Update: Didn’t work. 😉)
Some will love the album. Others won’t, either because they won’t understand it, or because they will but dislike it all the same.
And that’s how it should be. 👀
As Mike Tyson once said:

Of course, he said “man,” but it’s universal advice 🌏, which even applies to brands, I’d argue.
What does this have to do with SEO, though? 🧐
Well, our field evolves constantly. We have information flying all around us at any given moment — new facts, findings, or opinions.
It’s a big part of what makes SEO fun, imo 🤗, but it can also feel like a barrage and get overwhelming, especially while trying to balance client or company work, personal learning, professional growth, and life beyond it all. 🥵
It’s not only SEOs that can feel this way, though.
AI is evolving so rapidly that many fields are feeling the pressure to stay up on it all. (Try counting how many new ML papers get published in an average week. 😅)
👉 I’ve found the easiest way to navigate everything, for me, is by picking a lane, following a routine, including a choice group of resources or support system, and staying true to core principles. Of course, being adaptable counts for a lot, too. 🙌
As for understanding, there’ll be aspects of what we do that some will misunderstand, either temporarily or permanently. But the ones who get it will be the people who help establish our foundations (who’ll literally “rock” with us), and that’s the solid ground from which we can build something cool that’s ours to admire, celebrate, and share.
Celebrate understanding, just don’t sweat it if it doesn’t come 100% of the time.
That’s how it should be. 🔥
Buckle up for a full week’s recap, and enjoy the vibes (Black Sabbath, but starting with a side that’ll maybe surprise those unfamiliar with their deeper catalog 🤘):

Thank you for supporting Hamsterdam and the cause of SEO & AI learning. 🙏
Missed last week? Don’t worry, I got you! Read Part 65 to catch up.
🌟 Other great sources of weekly SEO news:
- The SEO Weekly – Garret Sussman, iPullRank
- SEOFOMO – Aleyda Solis
- Weekly Video Recaps – Barry Schwartz, SER
- Weekly SEO News YouTube channel – Olga Zarr, Seosly
- Niche Surfer – Yoyao Hsueh
Time for our weekly review of SEO social posts, articles, & more …
⚡️ Quick summary:
- Google is going more global, with a country labels test and expanded languages for translated search results.
- Shipping and returns info is now configurable in GSC
- 🔥 Pick of the week: Google AI Overviews: Do Ranking Studies Tell the Whole Story? by Rich Sanger — the more we understand how things work, the better we can serve users, imo
- 🐿️ Sneaky pick: Adaptive In-Context Learning 🤝 Vespa – part one by Jo Kristian Bergum — the more I follow his content, the better off I am
- 🧨 Dynamite bonus pick: Implementing ‘From Local to Global’ GraphRAG with Neo4j and LangChain: Constructing the Graph by Tomaž Bratanič — this is the quality of content I hope to get Hamsterdam Research to; think RAG + knowledge graphs + LLM-powered graph augmentation = GraphRAG
- And much more!
⏩ Jump to a section:
- News, Google updates, & SERP tests
- SEO tips & tidbits
- Fundamentals & resources
- Articles, videos & case studies
- Technical SEO
- Content marketing
- Local SEO
- Data analysis & reporting
- AI, LLMs, & machine learning
- Miscellaneous & general posts
- Older stuff that’s good!
Or keep scrolling to see it all. ⏬
Now, let’s step inside the white flags of Hamsterdam …

📰 SEO news, Google updates, SERP tests & notable posts
Notable updates or news related to Google Search and other SEO topics (aka, the Barry section).
🍟 SEO tips & tidbits
Actionable tips, cool tidbits, and other snackable findings and observations that can be teaching moments.
Nice find, Olaff! 💐
Paper excerpt: “We study the performance of both fine-tuned LMs, and in-context few-shot LLMs … We fine-tune different sizes of T5 models … We also fine-tune FLAN instruction-tuned T5 models … We evaluate variants of few-shot PaLM2 … We construct a dynamic prompt that collects examples for each test question 𝑞𝑖 by retrieving the most similar questions from the QuoteSum training set according to Sentence-T5 embedding cosine similarity (QSum). We also experiment with the ALCE prompt that uses fully-abstractive answers with in-line citations …” (arXiv link. Contributors: Tal Schuster, Adam D. Lelkes, Haitian Sun, Jai Gupta, Jonathan Berant, William W. Cohen, Donald Metzler)
Why it matters: SEMQA is a new task that existing LLMs can perform (mentioned above are models (T5, PaLM2, etc.) and methods, including fine-tuning and in-context learning, like our AI vocab this week 🙌). Instead of just summarizing information, or copying it word-for-word, SEMQA combines both approaches, extracting factual statements (quotes) directly from the original sources and combining them with AI-generated words and phrases. 👉 This is kind of what I was hinting Perplexity should do by including quotes from their interviews. AI-generated text needs human intervention once in a while, imo. 😇
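To picture the “dynamic prompt” part, here’s a rough sketch of retrieving the most similar training questions by embedding cosine similarity. (The paper uses Sentence-T5; the sentence-transformers model below is my stand-in, and the questions are made up.)

```python
from sentence_transformers import SentenceTransformer
import numpy as np

# Stand-in encoder (the paper uses Sentence-T5; this model is my substitution).
model = SentenceTransformer("all-MiniLM-L6-v2")

train_questions = [
    "Who wrote Pride and Prejudice?",
    "When did the Apollo 11 mission land?",
    "What causes tides on Earth?",
]
test_question = "Who authored Sense and Sensibility?"

# Embed everything, then rank training questions by cosine similarity.
train_emb = model.encode(train_questions, normalize_embeddings=True)
test_emb = model.encode([test_question], normalize_embeddings=True)[0]
scores = train_emb @ test_emb  # cosine similarity, since vectors are normalized

# The top-scoring training questions become the few-shot examples in the prompt.
for idx in np.argsort(scores)[::-1][:2]:
    print(round(float(scores[idx]), 3), train_questions[idx])
```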
Here’s an example of SEMQA from the paper (Figure 1):

📚 Fundamentals & resources
Essential information, concepts, or resources to learn about SEO or AI.
Excerpt: “Information gain refers to a score that indicates the additional information included in a document beyond the information contained in documents previously viewed by a user. … Techniques involve data from documents being applied across a machine learning model to generate an information gain score, assisting in presenting documents to the user in a manner that prioritizes those with higher scores of new information.”
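The patent describes a machine learning model producing the score, but just to illustrate the concept, here’s a deliberately crude toy that scores candidate documents by the share of terms the user hasn’t already seen:

```python
def information_gain_score(candidate: str, seen_docs: list[str]) -> float:
    """Toy proxy: fraction of a candidate's terms not already seen.

    The patent describes a learned model producing the score; this
    vocabulary-overlap version is only my illustration of the concept.
    """
    seen_terms = set()
    for doc in seen_docs:
        seen_terms.update(doc.lower().split())
    cand_terms = set(candidate.lower().split())
    if not cand_terms:
        return 0.0
    return len(cand_terms - seen_terms) / len(cand_terms)

seen = ["how to change a tire", "tools needed to change a tire"]
candidates = ["how to change a tire safely", "choosing the right spare tire size"]

# Rank candidates so the one adding the most new information comes first.
for doc in sorted(candidates, key=lambda d: -information_gain_score(d, seen)):
    print(round(information_gain_score(doc, seen), 2), doc)
```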
📑 Articles, videos & case studies
Longer-form content pieces shared on social, in newsletters, and elsewhere.
Critical SERP Features of Google’s shopping marketplace – Kevin Indig, Growth Memo

I Disavowed Every Link To My Website. Here’s What Happened – Cyrus Shepard, Zyppy

Excerpt: “Every influencer and their mother has mentioned how waking up at 4 am has changed their lives. I follow Alex Hormozi a LOT, and he talks about this thing almost every time. The basic idea that I got from him is that you get a 4-hour block that you should use to do your own work before doing any other work. … I started this on 18th June btw and it lasted 4 days.”
🧑‍💻 Technical SEO
Everything from basics to advanced moves (and also tools).
✍️ Content marketing
From what is helpful content to user journeys and beyond.
I accidentally got featured in The Guardian (here’s what you can learn from it) – Mark Rofe, The Digital PR Newsletter #26 (but using LinkedIn version)


Perplexity planning revenue sharing program with web publishers next month – Emilia David, VentureBeat

📍 Local SEO
From Google Business Profiles or reviews and more!
📊 Data analysis & reporting
Showing that what you’re doing is helping.
5 pro tips in Google Sheets from a veteran finance leader – Eric Refuerzo, Google Workspace

🤖 AI, machine learning, & LLMs
News related to models, papers, and companies.

Adaptive In-Context Learning 🤝 Vespa – part one – Jo Kristian Bergum, Vespa.ai

Google DeepMind Introduces a Parameter-Efficient Expert Retrieval Mechanism that Leverages the Product Key Technique for Sparse Retrieval from a Million Tiny Experts – Aswin Ak, MarkTech Post

Why it matters: PEER is a new architecture for MoE models (which enhance LLM capabilities, such as in Gemini 1.5) that allows for efficient scaling to a massive number of experts. This could lead to more powerful and accessible AI models that could adapt continuously through lifelong learning. In theory, such advancements to LLMs could also lead to more sophisticated search algorithms. (View paper on arXiv. Contributors: Xu Owen He)
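For the curious, here’s a toy NumPy sketch of the product-key lookup idea PEER builds on (not the paper’s implementation): splitting the routing query in half means you only score two small codebooks instead of every expert.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_sub, k = 8, 32, 4  # half-dimension, sub-keys per codebook, top-k
# Two sub-key codebooks; their Cartesian product implies 32 * 32 = 1,024 experts.
K1, K2 = rng.normal(size=(n_sub, d)), rng.normal(size=(n_sub, d))

q = rng.normal(size=2 * d)  # routing query for one token
q1, q2 = q[:d], q[d:]

# Score each half against its small codebook; keep the top-k per half.
top1 = np.argsort(K1 @ q1)[-k:]
top2 = np.argsort(K2 @ q2)[-k:]

# Only k * k candidate experts (here 16, not 1,024) need full scoring.
cands = [(i, j, K1[i] @ q1 + K2[j] @ q2) for i in top1 for j in top2]
for i, j, s in sorted(cands, key=lambda t: -t[2])[:k]:
    print(f"expert ({i}, {j}) score {s:.2f}")
```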
Microsoft Drops From OpenAI’s Board – Perplexity Team

🤔 General marketing & miscellaneous
This is for great content that isn’t necessarily SEO or marketing-specific. PPC, PR, dev, design, and social friends, check it out!
Progressive Web Apps (PWA): A Comprehensive Guide – Kasie Udoka Success, Dev.to

17 Libraries to Become a React Wizard 🧙‍♂️🔮✨ – Anmol Baranwal, CopilotKit, Dev.to

💎 Older stuff that’s good!
Not everything I find worth sharing is new as of this week, so these are gems I came across published in the past.
All Killer No Filler: Metrics That Matter in Local SEO [Video] – Jessie Low, Kick Point Playbook
![All Killer No Filler: Metrics That Matter in Local SEO [Video]](https://i0.wp.com/ethanlazuk.com/wp-content/uploads/2024/07/local-seo-metrics.webp?resize=642%2C1200&ssl=1)
Total noob’s intro to Hugging Face Transformers – Andrew Jardine, Hugging Face

Streamlining Embedding with LangChain: Harnessing Hugging Face Without Downloads – Charan H U, Medium

Great job making it to the end. You rock! 🪨
🤝 Want help with your SEO strategy?
I’m an independent SEO consultant focusing on custom audits and holistic strategies for brands. Don’t hesitate to reach out, or visit my about page for more information.
Let’s connect!
Hit me up anytime via text or call at 813-557-9745 or on social or email:
Cheers! ✌️