🐹 Hamsterdam Part 118: Weekly SEO & AI News Recap (10/4 to 10/10, 2025)
By Ethan Lazuk
Last updated:
A weekly look-back at SEO & AI news, tips, and other content shared on social media & beyond.

Opening notes, thoughts, and musings:
- Welcome to another new week of Hamsterdam! 🐹
- I appreciate you being here, and look forward to sharing the week’s news! 🙌
If you’re in a rush, hop down to the news portion. 📰
Or continue reading for two vocabulary lessons plus an introduction.

And we’re off …
🧑‍💻 Marketing word of the week: “Web crawler”
A web crawler (also known as a spider or bot) is an automated program that systematically browses the internet to discover, download, and index web pages.

Crawlers start with a known list of URLs called seed URLs. Next, they fetch pages, read their HTML, and follow links to discover new pages. They then store the content and metadata in a massive index, kind of like a library catalog of the web. Finally, crawlers revisit sites regularly to check for changes, ensuring freshness and accuracy.
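The crawl loop above can be sketched in miniature. Here's a toy breadth-first crawler over an in-memory link graph; the URLs and links are made up for illustration, and a real crawler like Googlebot would fetch pages over HTTP and parse their HTML instead of reading a dictionary:

```python
from collections import deque

# Toy link graph standing in for the web (hypothetical URLs, no network calls)
web = {
    "https://example.com/":  ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b", "https://example.com/c"],
    "https://example.com/b": [],
    "https://example.com/c": ["https://example.com/"],
}

def crawl(seed_urls):
    """Breadth-first crawl: fetch a page, 'index' it, queue its unseen links."""
    frontier = deque(seed_urls)  # URLs waiting to be fetched
    index = set()                # stands in for the search index
    while frontier:
        url = frontier.popleft()
        if url in index:         # skip pages we've already crawled
            continue
        index.add(url)
        for link in web.get(url, []):  # a real crawler parses HTML for links here
            if link not in index:
                frontier.append(link)
    return index

print(sorted(crawl(["https://example.com/"])))
```

Starting from a single seed URL, the crawler discovers all four pages by following links, which is exactly the seed-fetch-follow cycle described above.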
Google uses crawlers like Googlebot, Googlebot-Image, and Googlebot-News to build its Search index. These crawlers can be controlled with robots.txt files or meta tags. Meanwhile, crawling efficiency and relevance are guided by systems like Caffeine and Navboost, which help decide which pages matter most. Bing uses Bingbot similarly; it also powers Copilot and ChatGPT’s Bing integration. Other crawlers include YandexBot, BaiduSpider, and DuckDuckBot.
OpenAI has a web crawler named GPTBot; its purpose is to crawl publicly available web content that can be used to improve or train models. There’s also OAI-SearchBot, which powers ChatGPT’s search features. This bot is intended for discovery and indexing for ChatGPT search, not model training.
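To make the robots.txt control concrete, here's a hypothetical sketch (the `/private/` path is made up; the user-agent tokens are the ones the crawler docs publish). It blocks one directory from Googlebot, opts the whole site out of GPTBot's training crawls, and still welcomes OAI-SearchBot so the site stays discoverable in ChatGPT search:

```
# Keep Google's main crawler out of one directory only
User-agent: Googlebot
Disallow: /private/

# Opt the whole site out of model-training crawls
User-agent: GPTBot
Disallow: /

# Stay discoverable in ChatGPT search
User-agent: OAI-SearchBot
Allow: /
```

Note that well-behaved crawlers honor these directives voluntarily; robots.txt is a request, not an enforcement mechanism.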
🤖 AI word of the week: “Word2Vec”
Word2Vec is a neural-network model that converts words into continuous vector representations (embeddings). It is one of the most influential models in natural language processing (NLP). In short, Word2Vec taught computers to understand words not just as symbols but as meaningful vectors that capture relationships and context.

Word2Vec was developed by Google in 2013. Rather than understanding language directly, Word2Vec learns from patterns of word co-occurrence in large text corpora. It uses CBOW (continuous bag of words), which predicts a target word from its surrounding words, as well as skip-gram, which does the reverse: predicting surrounding words from a target word. Both methods train a small neural network that, once trained, yields the word embeddings.
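To make the two training setups concrete, here's a minimal Python sketch that generates the examples each method learns from (the function names are my own; real implementations like gensim add extras such as subsampling and negative sampling on top of this):

```python
def skipgram_pairs(tokens, window=2):
    """Skip-gram: each target word predicts each of its context words."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

def cbow_examples(tokens, window=2):
    """CBOW: the surrounding context words jointly predict the target word."""
    examples = []
    for i, target in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window), min(len(tokens), i + window + 1))
                   if j != i]
        examples.append((context, target))
    return examples

tokens = "the quick brown fox".split()
print(skipgram_pairs(tokens, window=1))
print(cbow_examples(tokens, window=1))
```

Notice the symmetry: skip-gram turns each position into several (target, context) pairs, while CBOW produces one (context list, target) example per position.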
Word2Vec is important because it captures semantic similarity and mathematical relationships between words (the famous example: king − man + woman ≈ queen). It laid the foundation for modern NLP, paving the way for GloVe, FastText, ELMo, and transformer models like BERT and GPT.
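A toy sketch of that vector arithmetic in Python. The 3-dimensional vectors below are hand-crafted purely for illustration (real Word2Vec embeddings have hundreds of dimensions learned from text), but the cosine-similarity lookup works the same way:

```python
import math

# Hand-crafted toy "embeddings" -- NOT real Word2Vec output
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point in the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# The classic analogy: king - man + woman ≈ ?
target = [k - m + w for k, m, w in zip(vectors["king"], vectors["man"], vectors["woman"])]

# Find the vocabulary word whose vector is nearest the result
nearest = max(vectors, key=lambda word: cosine(vectors[word], target))
print(nearest)  # → queen
```

The arithmetic works because the "royalty" and "gender" relationships were baked into consistent vector directions, which is what Word2Vec learns automatically from co-occurrence statistics.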
Compared with later models like BERT, Word2Vec has clear limitations: it assigns a single vector (and thus a single meaning) per word, so it can’t distinguish the different senses of a word like “bank.” It also doesn’t handle out-of-vocabulary words or long-range dependencies well.
😊 Introduction to week 118: “Reading”
As SEOs, we typically do a lot of reading, whether it’s understanding webmaster guidelines or keeping up to date with articles covering the latest news and information around search marketing.
Lately, I’ve rediscovered my passion for reading alternative materials, i.e., books!
I’ve started with some novels by writers for my favorite TV show, The Wire, including George Pelecanos and Richard Price. Now I’m also getting into history books.
Whereas before I typically only read “for work,” meaning SEO content, getting into reading other materials has helped me recenter myself and enjoy learning in general.
If you’re like me and haven’t picked up a book in a while, consider giving it a shot.
Thank you for supporting Hamsterdam and the cause of SEO & AI learning. 🙏
Enjoy the vibes:
Missed last week? Don’t worry, I got you! Read Part 117 to catch up.
🌟 Other great sources of weekly SEO news:
- The SEO Weekly – Garrett Sussman, iPullRank
- SEOFOMO – Aleyda Solis
- Weekly Video Recaps – Barry Schwartz, SER
- Weekly SEO News YouTube channel – Olga Zarr, Seosly
- Niche Surfer – Yoyao Hsueh
Time for our weekly review of SEO social posts, articles, & more …

Now, let’s step inside the white flags of Hamsterdam …
⏩ Jump to a section:
- News, Google updates, & SERP tests
- SEO tips & tidbits
- Fundamentals & resources
- Articles, videos & case studies
- Technical SEO
- Content marketing
- Local SEO
- Data analysis & reporting
- AI, machine learning, & LLMs
- Miscellaneous & general posts
- Older stuff that’s good!
Or keep scrolling to see it all. ⏬
📰 SEO news, Google updates, SERP tests & notable posts
Notable updates or news related to Google Search or related SEO topics.
🍟 SEO tips & tidbits
Actionable tips, cool tidbits, and other snackable findings and observations that can be teaching moments.
🎓 Fundamentals & resources
Essential information, concepts, or resources to learn about SEO or AI.
📚 Articles, videos & case studies
Longer-form content pieces shared on social, in newsletters, and elsewhere.
🧑‍💻 Technical SEO
Everything from basics to advanced moves (and also tools).
✍️ Content marketing
From what is helpful content to user journeys and beyond.
>>Check back next week!
📍 Local SEO
From Google Business Profiles to reviews and more!
📊 Data analysis & reporting
Showing that what you’re doing is helping.
>>Check back next week!
🤖 AI, machine learning, & LLMs
News related to models, papers, and companies.
🤔 General marketing & miscellaneous
This is for great content that isn’t necessarily SEO or marketing-specific. PPC, PR, dev, design, and social friends, check it out!
>>Check back next week!
💎 Older stuff that’s good!
Not everything I find worth sharing is new as of this week, so these are gems I came across published in the past.
Great job making it to the end. You rock! 🪨
Let’s connect!
Hit me up anytime via text or call at 813-557-9745 or on social or email:
Cheers! ✌️