Hamsterdam Part 58: Weekly SEO & AI News Recap (5/13 to 5/19, 2024)
By Ethan Lazuk
A weekly look-back at SEO & AI news, tips, and other content shared on social media & beyond.

Opening notes:
- Welcome to another week of Hamsterdam! (So much stuff this week, I had to remove like 40% of it for mobile loading …)
- We hit no. 1 on Friday for “weekly SEO news” on Google. Pretty cool, right?! (Of course, we dropped to page 2 a few days before, so who knows?!)

- I’ve included a special I/O section but put emphasis on other topics — including TikTok content and an amazing ML/DNN video in the older-but-still-good section.
- In other Hamsterdam news …
- I wrote about neural network terminology in an SEO context in Hamsterdam Research, good for machine-learning basics.
- In Hamsterdam History, I revisited an interview Jeff Dean (Google DeepMind) gave in 2013 about neural networks and compared it to I/O news.
- I also want to have more general marketing content, so I created Hamsterdam Marketing, where we delved into personas for SEO as our first topic.
- On that note …
- I’ve also added two vocabulary sections to Hamsterdam recaps! One of my favorite parts of my Google Discover feed is the word of the day, so I’ll be doing that here for marketing and AI terms.
Want each week’s Hamsterdam recap delivered? Subscribe to the free newsletter! (It’s pretty much a link to this article, but it’ll be conveniently emailed to you.) 😉
*Feel free to jump to the news recap below, or continue reading for words of the week, “This week in SEO history,” plus an introduction and summary first!
Marketing word of the week: “above the fold”
Above the fold refers to the visible section of a webpage that loads in the viewport before a user scrolls down.

Newspapers, especially, were once printed on large sheets of paper and folded to sell at newsstands. This meant stories above the fold were seen first by passersby, so their headlines had to grab attention!

The same applies to webpages today. (There’s a TikTok video on this topic later in the recap.)
You generally want to include your primary heading (typically an H1) above the fold, so users know what content to expect.
You might also place a CTA (call to action) above the fold if your visitors have clear intent to convert, whereas uncertain visitors might want supporting text, imagery, or navigational elements like jump links to learn more.
In Google’s Core Web Vitals (CWV), largest contentful paint (LCP) represents the render time of the largest content element in the viewport (above the fold).

While it’s a best practice to lazy load images below the fold (waiting to load them until they’re scrolled into the viewport), above-the-fold content should load as quickly as possible. (A good LCP is under 2.5 seconds.)
A satisfying above-the-fold experience is more likely to improve metrics like dwell time, time on page, and engagement rates.
You can also measure depth of scrolling with heat maps, like in Microsoft Clarity, which shows an “average fold” dotted line.

If you monetize your website, beware of overly aggressive ads above the fold. Not only can these slow a page’s load time, but they can turn off users by diminishing the page experience.
You’ll also want to be mindful of how your title links correspond to your primary headings (H1s). If users expect one topic or angle from search results but encounter another on the page itself, they may bounce in frustration.
Overall, the better your above-the-fold experience, the more positive user interaction signals you’ll send back to search engines regarding the relevance and quality of your content, not to mention conversions.
AI word of the week: “activation function”
An activation function is a mathematical operation applied to a neuron’s input to determine its output. In neural networks, it helps the model learn nonlinear (complex) relationships between features (the individual characteristics or attributes of the input data that the model uses to make predictions) and a label (the target output, or correct answer, the model is trying to predict).
Neural networks are made up of an input layer, one or more hidden layers of interconnected neurons, and an output layer that signals feed forward to. The activation function is applied to the weighted sum of the inputs to each neuron in the hidden layers. Below is an example of a feed-forward neural network with a Sigmoid activation function.

The concept of nonlinear relationships is important to grasp in machine learning. Many real-world phenomena exhibit complex relationships that linear representations (such as linear regression) can’t capture. This is what makes neural networks valuable for modeling these intricate patterns and dependencies — especially deep neural networks (DNNs) for tasks like image recognition and natural language processing (NLP) — and activation functions are key here.

Common examples of activation functions include Sigmoid, tanh, and ReLU, shown on the left side below.

As a takeaway, activation functions are used within the hidden layers of neural networks and allow the models to capture intricate (nonlinear or complex) connections between input data (features) and the desired output (label).
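To make that concrete, here’s a minimal sketch in plain Python of the three activation functions named above, applied to a single hidden neuron’s weighted sum. (The weights, inputs, and bias are made-up values for illustration, not from any real model.)

```python
import math

# Common activation functions, each applied to a neuron's
# weighted-sum input z.

def sigmoid(z):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Squashes input into (-1, 1), centered at zero
    return math.tanh(z)

def relu(z):
    # Passes positive inputs through; zeroes out negatives
    return max(0.0, z)

# One hidden neuron: activation applied to the weighted sum of inputs
x = [0.5, -1.2, 3.0]    # features (input data)
w = [0.4, 0.1, -0.2]    # weights (illustrative values)
b = 0.1                 # bias term

z = sum(wi * xi for wi, xi in zip(w, x)) + b   # weighted sum ≈ -0.42
print(round(sigmoid(z), 3), round(tanh(z), 3), relu(z))
```

Note how ReLU zeroes out the negative sum entirely while Sigmoid and tanh merely squash it — stacking layers of these nonlinear squashes is what lets the network model relationships a straight line can’t.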
This week in SEO history: “Line Mode Browser” (1991)
Line Mode Browser (which is formally called the Libwww Line Mode Browser) was introduced on May 14th, 1991 by a team that included Tim Berners-Lee (the creator of HTML, URI, HTTP, and the first browser, “WorldWideWeb”), Henrik Frystyk Nielsen, and Nicola Pellow.
At least, that’s the date according to Web Design Museum. Other sources say January 1992 was the rollout. Digging in, I think May 1991 was when there was a general release of WWW on central CERN machines.
Line Mode Browser was the second browser ever made for the world wide web, and the first with a cross-platform codebase (Nicola Pellow’s innovation). This meant it could be installed on different kinds of computers (not just the proprietary NeXT machines that WorldWideWeb was made for).
To accomplish this cross-platform use, the browser displayed only text. A mouse couldn’t be used, so links were visited using keyboard inputs.
What’s awesome is there’s a simulator available for Line Mode Browser.
Below you can see numbers in brackets (like “[14]”), which are links to documents, called “references.” To visit the May history document in the example below, we’d type in “14” and hit RETURN. Then the page prints text on the screen line by line, like a teleprinter. It’s a fun, Matrix-like experience.

Here’s a part SEOs might find interesting, as well.
According to the WWW Line Mode Browser quick guide (which is still online), there was a search function based on keywords:
“Some documents are indexes. These contain little text, but allow you to search for information with keywords. Type ‘find’ or ‘f’ (space) and the keywords. For example, ‘f sgml examples’ searches the index for items with keywords SGML and EXAMPLE. You can only use the ‘find’ command when it is present in the prompt. You can omit the ‘f’ if the first keyword doesn’t conflict with existing commands.”
– WWW Line Mode Browser quick guide
Pretty cool!
Speaking of developments in search …
Let’s get to our introduction this week, talking about AI, but not how you might think.
Introduction to week 58: “blank stare banker”

You’ve probably heard a lot about Google I/O this week. Combined with other announcements, it might even seem like AI is taking over everything we talk about these days.
I have a story that shows that might not be the case … though I think it should be.
But first, let’s recap the week a bit.
Google I/O was on Tuesday (May 14th). Here’s a list of the top 100 announcements. There were others, as well (like Model Explorer).
Google introduced several search-related features. The main announcement was the rollout of AI Overviews (formerly SGE via Search Labs) to all U.S. users now and 1 billion people globally this year.
Another update was video queries (video inputs via Google Lens with AI Overview responses), although this is a coming Search Labs feature. Personally, I see video inputs as more germane to Gemini (say that 5 times fast) than search, but we shall see!
The sneaky big news for search, in my opinion, was AI-organized result pages, which are personalized to users. This is already live for all U.S. English searches. My expectation is it’s the truer future direction of Google Search.
If you’ve followed topics from Discover in Search and used the About this result feature, you likely saw how widespread personalization already was. (That trend also goes back decades.)
While using generative AI to summarize search results is helpful for some uses (I reference them often, tbh), I see AI Overviews more as a Perplexity-style response that’s most pertinent to early adopters.
Meanwhile, personalizing an entire SERP with a variety of media and search journey categories, so users can “slice and dice” or “explore” (remember those terms from our 2009 Searchology history lesson on “Search Options”?), feels more universally helpful and in line with Google’s bigger goal of eliminating friction points.
Personally, I didn’t see anything that changed my approach to SEO strategies, other than needing to update “SGE” mentions in my service pages to say “AI Overviews.” 😉
It was all pretty much in line with expectations, and I’m personally excited about an era of multimodal search that crosses multiple surfaces. (Attribution will get improved over time, I’m sure.)
But I do think the nature of SEO workflows is changing quickly.
The big news for consumers at I/O, I thought, was the set of upgrades to Gemini Advanced (like data analysis, 1.5 Pro, and a 1 million token context window) and the demoed “agentive” experiences, a carryover from what we saw at Cloud Next last month.
As OpenAI’s Sam Altman said in an interview (which you can find below in the TikTok section), asking Siri on iPhone to set an alarm is much easier than doing it manually. However, shopping on DoorDash or ordering an Uber is still something most would prefer to do themselves.
But digging through email to find a receipt? Yeah, you got that one, Gemini!
I also thought the big news overall was Gemini 1.5 Flash (which in testing so far is super fast) and the expansion of 1.5 Pro’s context window to 2 million tokens. Rather than building a custom RAG pipeline at that point, just upload your documents and carry on.
But why did I call this introduction “blank stare banker”?
Well, for all the excitement this week at OpenAI and from Google I/O (and we also have Microsoft Build coming up!), I also got a reminder that kept things in context.
I was at Wells Fargo on Monday (during the GPT-4o announcement) to open a business account, when the banker (a talkative young person) told me, “You have the wrong role assigned on your state business form.”
It was something like that. I don’t recall exactly.
And I said, “Oh, you know what, I think I just asked ChatGPT which option to pick for that.”
They went quiet and stared blankly, and that’s when I realized, this smart, capable person had no idea what ChatGPT was.
I wasn’t shocked, truth be told. I hear similar from other professionals, that they’re “holding off” before getting involved with AI.
This is just my opinion, but I think everyone needs to drop what they’re doing and get involved right now.
To what degree is up to you.
But there’s a learning curve with AI, especially the limits of its operations.
AI models can save us a ton of time, or they can create extra work — like explaining when they can’t do SEO. 😉
Learning that takes time.
Outside of using AI for SEO work efficiencies, I enjoy studying deep neural networks from an academic perspective to learn how the models work.
For other professionals, it might be beneficial to get familiar with the different companies creating models, and the types of products available.
As you’ll hear in a few videos below, some companies are skipping tools like Copilot (due to cost) and making their own custom ones with open-source models (like Meta’s Llama 3). Meanwhile, the rate of new model deployment is expected to be about every 4-6 months.
Just as SEOs advise clients on digital marketing, so too can we give suggestions for AI usage based on us having a more in-depth familiarity.
AI has already been a part of most aspects of society, with some of its underpinnings, like neural networks, going back to the 1960s. Yet it will soon be far more visible from both consumer and professional perspectives.
I wouldn’t wait for anyone to show you the way. Instead, take the bull by the horns and go explore! Paying attention is all you need. 🙂
One place to start getting inspired is Google Labs. Another is YouTube, where the quality of DNN lectures and free tutorials for Python, PyTorch, etc. is refreshing.
You might be surprised just how quickly you can accomplish your goals, or even discover new ones. 😉
Buckle up for a full week’s recap, and enjoy the vibes:
Thank you for supporting Hamsterdam and the cause of SEO & AI learning.
Missed last week? Don’t worry, I got you! Read Part 57 to catch up.
Other great sources of weekly SEO news:
- The SEO Weekly – Garret Sussman, iPullRank
- SEOFOMO – Aleyda Solis
- Weekly Video Recaps – Barry Schwartz, SER
- Weekly SEO News YouTube channel – Olga Zarr, Seosly
- Niche Surfer – Yoyao Hsueh
Now, time for our weekly review of SEO social posts, articles, & more …

Quick summary
- ChatGPT GPT-4o released (natively multimodal); Sam Altman called it “her,” which stirred excitement; mystery chatbot solved?
- Google I/O happened; AI Overviews rolled out (U.S.); sneaky big news is AI-organized results pages; Gemini Advanced to get 1.5 Pro and 1 million token context window
- Google launches web filter; JM speaks about HCU-impacted sites
- Perplexity’s head of search gives interview; CEO trolls Google; Microsoft does, too
- And much more!
Jump to a section of this week’s recap:
- I/O news
- News, Google updates, & SERP tests
- SEO tips & tidbits
- Fundamentals & resources
- Articles, videos & case studies
- Local SEO
- Technical SEO
- Content marketing
- Data analysis & reporting
- AI, machine learning, & LLMs
- TikTok section
- Humor
- Miscellaneous & general posts
- Older stuff that’s good!
Or keep scrolling to see it all.
Ok, time to step inside the white flags of Hamsterdam …

I/O news
Some stuff Google shared on social during I/O 2024.
You can watch the full I/O keynote on YouTube here.
SEO news, Google updates, & SERP tests
Notable updates or news related to Google Search or related SEO topics.
Circle to Search may no longer be an Android exclusive, could come to Chrome on iOS – Ryan McNeal, Android Authority

SEO tips & tidbits
Actionable tips, cool tidbits, and other findings and observations that can be teaching moments.
SEO (and AI) fundamentals & resources
Essential information, concepts, or resources to learn about SEO or AI.
Articles, videos, case studies & more
Longer-form content pieces shared on social, in newsletters, and elsewhere.
Google is redesigning its search engine — and it’s AI all the way down – David Pierce, The Verge

“Too often, I meet with SEOs and businesses whose approach is backward. They start off saying, ‘I have this thing. Make it rank for this keyword.’ That’s the wrong approach. A better approach is to start with the keyword, understand the user intent and what they would find useful – and then go build that.”
Alexandr Yarats, Head of Search at Perplexity – Interview Series – Antoine Tardif, Unite.AI

Local SEO
What’s happening in your local neck of the woods; well, actually in local search.
Technical SEO
Everything from basics to advanced moves (and also tools).
Content marketing
From what is helpful content to user journeys and beyond.
Data analysis & reporting
Showing that what you’re doing is helping.
AI, machine learning, & LLMs
Why it matters: 1-bit transformers are a new type of neural network architecture that replace 32-bit weights with 1-bit weights, making them smaller and more efficient. This leads to benefits like running on devices with less power and memory. These transformers could be used to develop more powerful chatbots or virtual assistants or even new applications like real-time machine translation or on-device NLP for mobile devices.
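As a rough illustration of the core idea (a toy sketch in plain Python, not the actual BitNet implementation), here’s how replacing full-precision weights with just their signs plus one shared floating-point scale might look — the values are made up:

```python
# Toy sketch of 1-bit weight quantization: keep only each weight's
# sign (storable in a single bit), plus one shared scale per matrix.

def binarize(weights):
    # Shared scale = mean absolute value of the original weights
    scale = sum(abs(w) for w in weights) / len(weights)
    # Each weight collapses to just +1 or -1
    signs = [1 if w >= 0 else -1 for w in weights]
    return signs, scale

def dequantize(signs, scale):
    # Approximate reconstruction used at compute time
    return [s * scale for s in signs]

full_precision = [0.8, -0.3, 0.05, -0.6]   # made-up 32-bit weights
signs, scale = binarize(full_precision)
approx = dequantize(signs, scale)
print(signs, scale, approx)
```

The magnitudes are lost, but the signs and one shared scale preserve enough structure for the network to function — which is where the memory and power savings come from.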
Building on our commitment to delivering responsible AI – Lila Ibrahim & James Manyika, The Keyword (Google)

Why it matters: AlphaFold 3 is BIG TIME. Also, the advancements in the education field, I think, will revolutionize opportunity, especially for people who like self-learning. 😉 Spread the word!
How ‘Chain of Thought’ Makes Transformers Smarter – Vineet Kumar, MarkTech Post

Why it matters: The paper explains the theoretical reasons why chain of thought (CoT) is effective for improving reasoning capabilities of LLMs. CoT involves prompt engineering to instruct LLMs to think step-by-step. It can be as simple as adding “think step by step” to the end of a prompt. This encourages the model to break down a problem into smaller sequential steps. Without CoT, transformers did parallel computations, or processing different pieces of information simultaneously, while struggling at reasoning sequentially, where one step depends on the outcome of another. As the article points out, CoT makes transformers powerful enough for any computationally hard problem, theoretically. Here’s a link to the paper. This was applied to “decoder-only transformers through the lens of expressiveness,” meaning generative tasks relevant to LLMs.
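Here’s a minimal sketch of the zero-shot CoT trick described above — simply appending a step-by-step instruction to a question before sending it to a model. (The question and suffix wording are illustrative; the actual model call, which any chat API could handle, is omitted.)

```python
# Zero-shot chain-of-thought prompting: append a step-by-step
# instruction so the model reasons sequentially instead of
# jumping straight to an answer.

COT_SUFFIX = "Let's think step by step."

def make_cot_prompt(question: str) -> str:
    # Question first, then the CoT nudge on its own line
    return f"{question}\n\n{COT_SUFFIX}"

prompt = make_cot_prompt(
    "A site has 120 pages and 30% are noindexed. How many are indexable?"
)
print(prompt)
```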
This AI Paper by Microsoft and Tsinghua University Introduces YOCO: A Decoder-Decoder Architectures for Language Models – Nikhil, MarkTech Post

Why it matters: YOCO is highly efficient at processing long text sequences, like would be needed for document understanding, code generation, or conversational responses. In short, it’s a new take on a language model that combines the benefits of, say, GPT and BERT. Here’s a link to the paper. The researchers note how, “Experimental results demonstrate that YOCO achieves favorable performance compared to Transformer in various settings of scaling up model size and number of training tokens.” And given the 1 million token context window of Gemini, it’s notable how they said, “We also extend YOCO to 1M context length with near-perfect needle retrieval accuracy.”
The future of Microsoft’s Copilot – CNBC

Why it matters: This video talks about Copilot cost limitations for businesses vs. creating custom tools with open-source models.
Why it matters: Here’s a link to the PDF. I had Gemini 1.5 Flash summarize it for us (it took under 10s and used 115k tokens, or about 12% of the 1 million available, or 1.2% of 10 million): “This paper introduces Gemini 1.5 Pro and Gemini 1.5 Flash, the next generation of multimodal language models capable of handling massive amounts of context (up to 10 million tokens), significantly exceeding the capabilities of current models like Claude 3.0 and GPT-4 Turbo. Notably, Gemini 1.5 models achieve near-perfect recall on long-context retrieval tasks across all modalities (text, video, audio) and demonstrate improved performance on various benchmarks, including long-document question answering and long-video question answering. These advancements in long-context capabilities do not come at the expense of core capabilities like math, science, reasoning, code, and multilingual understanding, with Gemini 1.5 Pro outperforming even its previous, more powerful iteration, Gemini 1.0 Ultra. While the paper highlights several real-world use cases, SEO professionals might be particularly interested in Gemini 1.5’s ability to process and analyze large amounts of data, potentially influencing the way SEO strategies are developed and executed.”
TikTok content
It’s a search engine, right?
@allinpodcastclips Sam Altman on OpenAI creating a new phone
♬ original sound – All-In Pod clips
@bloombergbusiness Never mind #movies, #television and even TikTok — #AI is going to lead to a whole "new form of content," argues #DreamWorks co-founder Jeffrey Katzenberg at the #QatarEconomicForum — #film #TV #tech #artificialitelligence #entertainment ♬ original sound – Bloomberg Business
@radwebdesigns Let’s make the perfect portfolio! After reviewing hundreds of award winning agencies and portfolio sites, I think this is the best above the fold hero section in web design. Start with clear and concise header that tells people what you do and who do you that for. It’s so easy to bloviate or try to be unique with forced copywriting. The best are simple, around 10 words, and help frame the type of work visitors are about to see below. Speaking of work, let’s talk about the things that can get web visitors to scroll: the preview! By simply showing a graphic of your design work peaking through the bottom, you can make the hero more interesting and get users to scroll. It’s easy to throw on a “scroll” label at the bottom, but don’t tell me to scroll. Show me what I’m scrolling to. Let me know what you think and if you’ll start implementing this on your agency/portfolio site! #webdesign #portfolio #webagency #portfoliodesign ♬ original sound – RadWebDesigns
@jasmine_bina I invited ethnographic researcher and brand strategist Peter Spear to come and speak to us for our series “Talks at Concept Bureau” about how to create brand mythologies. Here he talks about his favorite question to ask as a brand marketer when you’re doing user research. #brand #branding #brandstrategy #marketing #marketingtips #brandingtips #userresearch #brandrepsearch #brandmarketing ♬ original sound – Jasmine Bina 🎯 brands+culture
@verge With the help of Gemini Pro and other language models, Google believes it can finally build truly universal digital assistants that succeed where Alexa and Siri never could with Project Astra. #vergecast #googleio #ai #podcast #techtok ♬ original sound – The Verge
@dan..mbae more about JavaScript #tech #danmbae ♬ original sound – Dan mbae
Humor
Subjectively funny content.
General marketing & miscellaneous
This is for great content that isn’t necessarily SEO or marketing-specific. PPC, PR, dev, design, and social friends, check it out!
Older stuff that’s good!
Not everything I find worth sharing is new as of this week, so these are gems I came across published in the past.
Great job making it to the end. You rock!
Want help with your SEO strategy?
I’m an independent SEO consultant based in Orlando, Florida, focusing on custom audits and strategies for brands. Don’t hesitate to reach out, or visit my about page for more information.
Let’s connect!
Hit me up anytime via text or call at 813-557-9745, or on social or email.
Cheers!