Ethan Lazuk
-
🐹 Hamsterdam Part 67: Weekly SEO & AI News Recap (7/15 to 7/21, 2024)

A weekly recap of SEO- and AI-related news, tips, and other content shared on social media and beyond from 7/15 to 7/21, 2024.
-
🇰🇭 Addressing the Gap: How the Khmer Semantic Search Engine (KSE) Works & What That Can Teach Us as SEOs 🙌

We’ll explore the Khmer Semantic Search Engine (KSE), which uses advanced semantic matching techniques to improve the search experience for Cambodians and other Khmer speakers.
-
🐹 Hamsterdam Weekly TikTok Gems: Bonus Content for Part 66 (7/8-7/14, 2024) 💎

This is the TikTok content that I found most valuable this week (7/8 to 7/14, 2024) but didn’t have room in Hamsterdam Part 66 to include.
-
🐹 Hamsterdam Part 66: Weekly SEO & AI News Recap (7/8 to 7/14, 2024)

A weekly recap of SEO- and AI-related news, tips, and other content shared on social media and beyond from 7/8 to 7/14, 2024.
-
What I Learned from “Back to Basics for RAG w/ Jo Bergum” That’s Useful Context for SEO Today

What I learned about IR systems from Jo Kristian Bergum’s recent talk on RAG basics, and why I found it helpful from an SEO perspective.
-
🐹 Hamsterdam Weekly TikTok Gems: Bonus Content for Part 65 (7/1-7/7, 2024) 💎

This is the TikTok content that I found most valuable this week (7/1 to 7/7, 2024) but didn’t have room in Hamsterdam Part 65 to include.
-
🐹 Hamsterdam Part 65: Weekly SEO & AI News Recap (7/1 to 7/7, 2024)

A weekly recap of SEO- and AI-related news, tips, and other content shared on social media and beyond from 7/1 to 7/7, 2024.
-
How Does Perplexity Work? A Summary from an SEO’s Perspective, Based on Recent Interviews

Perplexity is an “answer engine,” not a search engine. Curious how it works? I looked at several interviews from Perplexity team members for this summary.
-
Summer “SLaM” & “CoSMo” Kramer: Investigating “Compressing Search with Language Models,” a Google Research Paper, & Why SEOs Should Care (Probably)

We’ll explore SLaM and CoSMo from a Google paper, “Compressing Search with Language Models,” and implications for SEOs in this Hamsterdam Research post.
-
Comparing a Verywell Mind Page (YMYL Topic) in 2017 vs. 2024 to See How On-Page SEO Evolves (& Its Limits), a Hamsterdam History Lesson

Comparing the on-page SEO of a Verywell Mind page on mindfulness meditation in 2017 and 2024 to see how SEO evolves, a Hamsterdam History post.
-
🐹 Hamsterdam Weekly TikTok Gems: Bonus Content for Part 64 (6/24-6/30, 2024) 💎

This is the TikTok content that I found most valuable this week (6/24 to 6/30, 2024) but didn’t have room in Hamsterdam Part 64 to include.
-
🐹 Hamsterdam Part 64: Weekly SEO & AI News Recap (6/24 to 6/30, 2024)

A weekly recap of SEO- and AI-related news, tips, and other content shared on social media and beyond from 6/24 to 6/30, 2024.
-
🦄 “Unicorn” Clicks & the Magic of Etymology for SEO Research, a Hamsterdam Marketing Lesson

What is a “unicornClick” attribute in the API Content Warehouse? In Hamsterdam Marketing, we’ll explore the word’s origins (or etymology). Buckle up!
-
Hamsterdam Part 63: Weekly SEO & AI News Recap (6/17 to 6/23, 2024)

A weekly recap of SEO- and AI-related news, tips, and other content shared on social media and beyond from 6/17 to 6/23, 2024.
-
Stumbling Upon Google Engineer Ni Lao’s Work, & Exploring What It Can Teach Us about ML, IR & NLP for SEO Insights (a Hamsterdam Research Post)

In this Hamsterdam Research post, we look at the work of Ni Lao, a Google engineer, for lessons about machine learning, information retrieval, and NLP.
-
Language, Art & Culture: SEO Lessons from a Cultural Anthropology Textbook (for Keywords, Content, Images & Page Experience)

This Hamsterdam Marketing lesson examines keyword, content, image, and page experience lessons for SEO gleaned from a cultural anthropology textbook.
-
Hamsterdam Part 62: Weekly SEO & AI News Recap (6/10 to 6/16, 2024)

A weekly recap of SEO- and AI-related news, tips, and other content shared on social media and beyond from 6/10 to 6/16, 2024.
-
Everything You Need to Know (Or A Lot of It, Probably) from Google Search Central’s SOTR Podcasts (as Told Through Gemini Prompted with 75 Transcript PDFs)

I loaded 75 transcripts from Google Search Central’s Search Off the Record (SOTR) podcast episodes into Gemini 1.5 Pro and asked questions. Here’s how it went.
-
Why Vanessa Fox’s 2008 Interview with Eric Enge About User-First & Holistic SEO is Still Relevant Today (and Tomorrow), a Hamsterdam History Lesson

In this Hamsterdam History lesson, we explore the holistic and user-first SEO principles from Vanessa Fox’s 2008 interview with Eric Enge.
-
Epistemic vs. Aleatoric Uncertainty in LLMs, via a Google DeepMind Paper, “To Believe or Not to Believe Your LLM,” & Why SEOs Should Care (Likely)

In this Hamsterdam Research post, we’ll explain Google DeepMind’s paper “To Believe or Not to Believe Your LLM” and why it’s relevant to SEOs (likely).