Ethan Lazuk

SEO & marketing professional.


🐹 Hamsterdam Part 66: Weekly SEO & AI News Recap (7/8 to 7/14, 2024)

By Ethan Lazuk

Last updated:

A weekly look-back at SEO & AI news, tips, and other content shared on social media & beyond.

Hamsterdam Part 66 Weekly SEO News Recap from 7/8 to 7/14 with Martin Splitt quote.
Source: SOTR Ep. 77 (transcript)

Opening notes, thoughts, and musings:

  • Welcome to a new week of Hamsterdam! 🐹
  • I appreciate you being here, and look forward to sharing the week’s news! 🙌
  • Firstly, Eminem’s new album is a mix of new and old school madness. My wife was even reconverted as a fan. Just … be careful. 💿 🎧 😳 💀 🔥
  • I’m going to try to be briefer this week. Challenge accepted. 🤝 (Update: Failed. 😂)
  • Fancy a longer read? Try my other blog content from this week:
  • 🤓 RAG basics with useful context for SEO, based on a talk by Jo Kristian Bergum
  • Bonus: Check out my TikTok finds of the week (a little bit of everything) 💎
  • Side note: I’m having trouble getting content indexed atm; TBD on seeing this in Search 🔍 (Update: We’re indexed! Just search “#1 SEO agency.” JK. 😆 ✌️)

We also deliver! Subscribe to the Hamsterdam newsletter. Hot and ready in 30 minutes or less 🍕, or sent out every Sunday. (One of those two.)

If you’re in a rush, hop down to the news portion. 📰

Or continue reading for two vocabulary lessons, this week in SEO history, plus an introduction. (Don’t forget to click the dropdowns.)

The Big Lebowski is this your homework Larry screenshot.

And we’re off …


Marketing word of the week: “Influencer marketing”

Influencer marketing involves a collaboration between a brand and an individual who can influence the buying decisions of their following, ideally one that aligns with the brand’s target audience, based on their perceived expertise or authenticity within a niche.

Back when I was agency side, I got my first taste of influencer marketing. I found it to be a funny process.

Perhaps what’s most funny about it is how well it can work. 🥳

I see big brands using influencers on TikTok. The beauty and health spaces are obvious ones: “Come get ready with me while I put [endorsed] shampoo in my hair.” 💆 Or rather than a Super Bowl ad, they’ll have someone in pajamas eating their pizza, filmed on an iPhone. 🍕 That said, tech companies use influencers, too.

In my opinion, some influencer marketing happens under the table. 🫣 The FTC has guidelines for disclosing sponsored content on TikTok, for example, but all you have to do is scroll through the comments of videos to see people questioning, “Is this an ad?” 🤨

There’s also a reverse scenario, where influencers get the attention of brands and are included in their promotions, like how Google featured a comedic DJ known from TikTok as the opening act at I/O this year, likely to appeal to his audience. (Content like this is what set him off. 👈)

On the flip side, social users are savvy, imo, and they can grow tired of being sold to. I’ve seen celebrity Instagram accounts lose credibility for hawking too many products: “Their feed is just them endorsing stuff.”

That said, influencers can be straight up or even have fun with it, like Cardi B often does. (Check the cabinet.) 🎃 ☕️ 💰

@iamcardib

My new favorite drink, took the psl and turned it up with Whipshots


The word “influencer” is kind of touchy, because we might think of people chasing clout, like by consuming laundry detergent, literally.

In reality, though, an influencer is someone with compelling expertise or experience in a knowledge domain. (So, I guess they E-E-A-T the Tide Pods. JK. 😂)

More brands today work with nano- (44%) or micro-influencers (26%) than macro ones (17%) or celebrities (13%). (Source. Also in the dropdown.)

Thanks to Discover, I also just came across this Influencity video on types of influencers that summarizes a lot of what I said but way more eloquently 😆 🙌:

🚨 ⬇️ 👈 Click here to read about possible SEO benefits and check out a few statistics about influencer marketing (if you want 😎).

Influencer campaigns can have SEO benefits, too. 🤗

Influencers’ content, if perceived as organic, can surface in search results (there’s a reason Google refers to webmasters now as “creators”). It can also drive brand awareness, which itself can lead to more navigational queries or higher trust (and then CTR) (and then probably boosted rankings), and even links or notable mentions. 🙌

I witnessed these benefits while working on cross-channel ecommerce marketing campaigns. That said, two difficulties that I saw were getting the company to send the product to the influencer on time and getting the influencer to shoot a video of a usable quality (or at all). 🤲

Serendipitously, Influencer Marketing Hub updated their 2024 benchmark report this week. It’s a 48-minute read, so you know I like it. 😆

Here are some highlights I found notable. 👀

📈 “63% plan to use AI in executing their influencer campaigns, 55% of these brands will use AI for influencer identification.”

That’s smart. Finding the right person to represent your brand is not easy. It’s not about the size of their following, but how well their followers align with your target audience, and how well the person fits your brand.

🧚 “There is a strong preference for working with small (nano – 44% and micro – 26%) influencers ahead of expensive macro-influencers (17%) and celebrities (13%).”

That supports the previous point about the right fit. There are plenty of ambitious people making cool content on social that aligns nicely with niche products or services.

💰 “60% of those respondents who budget for influencer marketing intend to increase their influencer marketing budget over 2024.”

That sounds impressive, but it’s not isolated. For instance, SEL featured a sponsored report from Ignite Visibility last month that says “82.5% of marketers plan to increase their SEO spending in 2024.”

Personally, I’d build a brand with SEO to then create evangelists who are organic influencers. Why pay for it when you can earn it for free through hard work? (JK on that last part. 😅 Creating compelling social videos is hard work. My last TikTok video has 1 view so far … 🤷)

📢 “The main purpose of running influencer campaigns is to create User Generated Content (56%). Generating sales (23%) is a distant second.”

This I love. It’s thinking that can apply broadly, too. SEO drives sales aplenty, for example, but it’s not all about lower funnel strategies. In my experience, patient brands who build up their informational assets reap long term rewards from that.

Or as Google’s former CEO Eric Schmidt says in a video on this week’s TikTok gems page, businesses should think (and act) 5 years ahead.


🧑‍🏫 🤖 ✅ AI word of the week: “In-context learning”

As luck would have it, this week is “I” words, and I recently learned about in-context learning (ICL) and wrote about it in a new article on RAG basics. Let’s elaborate on it here. 🙌

In-context learning refers to a large language model’s ability to learn and adapt during inference (while it’s being used) when you provide examples of how to respond or fulfill a task directly within the prompt. ICL differs from explicitly training a model for a task.

Google’s ML glossary says ICL is a synonym of few-shot learning, but upon further research, I believe there’s a subtle difference: few-shot learning is a technique within ICL where the prompt includes a small number of examples. Using a “few” examples is likely due to the constraints of context window sizes (input tokens) or for better precision.

Why is ICL necessary? Well, even though LLMs are trained on large datasets, they can’t possibly learn every specific task or nuance in advance. Few-shot learning allows a model to adapt more quickly to new situations for greater versatility.

Here’s an example prompt for ICL from Google’s ML glossary, which “contains two examples showing a large language model how to answer a query”:

Few-shot prompting example from Google.
Source
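To make the idea concrete, here’s a minimal sketch of what assembling a few-shot prompt looks like in code. The sentiment task, examples, and helper function are all hypothetical, purely for illustration; the point is that the “learning” lives entirely in the prompt string, with no change to model weights:

```python
# A minimal sketch of few-shot (in-context) prompting: the "learning"
# happens entirely inside the prompt, with no change to model weights.
# The task and examples here are hypothetical, for illustration only.

def build_few_shot_prompt(examples, query):
    """Assemble an instruction prompt that shows the model labeled
    examples of the task before asking the new question."""
    lines = ["Classify the sentiment of each review as Positive or Negative.", ""]
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model completes this line at inference
    return "\n".join(lines)

examples = [
    ("The shampoo left my hair soft and shiny.", "Positive"),
    ("Broke after two uses. Waste of money.", "Negative"),
]
prompt = build_few_shot_prompt(examples, "Arrived on time and works great.")
print(prompt)
```

You’d then send that single string to the model as its input; swap in more (or more complex) examples and the same pattern scales with the context window.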

Jo Kristian Bergum (whose talk is featured in the RAG basics article I linked earlier) published a new blog post this week for Vespa.ai about “adaptive in-context learning” (there’s also an excerpt in the AI section later), where he explains:

“In-context learning can be confusing for those with a Machine Learning (ML) background, where learning involves updating model parameters, also known as fine-tuning. In ICL, the model parameters remain unchanged. Instead, labeled examples are added to the instruction prompt during inference.”

– “Adaptive In-Context Learning 🤝 Vespa – part one,” Jo Bergum (Vespa.ai)

His article included example prompts, like the following one “with two examples from the training set”:

In-context learning prompt example.
Source

This illustrates prompt-based learning (or the technique of crafting specific prompts for few-shot learning ✍️), per Google’s ML glossary.

While ICL can make LLMs more adaptable and doesn’t require fine-tuning expertise, performance can also vary depending on the complexity of the task, quality of the examples, and capabilities of the model.

🚨 ⬇️ 👈 Click to continue reading about how ICL works on a more technical level (if you want 😎).

The foundation of ICL lies in the transformer architecture that powers most of today’s LLMs, specifically the mechanisms of self-attention and positional encoding.

Transformer architecture.
Source

Self-attention (red parts above) allows the model to weigh the importance of different words in the input (prompt) based on their context when generating the output (response). This enables the model to focus on the relevant parts of the examples provided during ICL.

Positional encoding helps the model understand the order of words, which is important for interpreting the structure and meaning of the provided examples.
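For the curious, these two mechanisms can be sketched in a few lines of NumPy. This is a toy, single-head illustration (real transformers use learned query/key/value projections and multiple heads, which are omitted here); the sinusoidal positional encoding follows the formulation from the original Transformer paper:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: gives each position a unique
    pattern so word order is recoverable from the input vectors."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def self_attention(x):
    """Scaled dot-product self-attention (single head, no learned
    projections): each token's output is a weighted average of all
    tokens, with weights from softmax(QK^T / sqrt(d))."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)  # how much each token attends to each other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ x

# Toy "embeddings" for a 4-token prompt, plus positional information
tokens = np.random.default_rng(0).normal(size=(4, 8))
x = tokens + positional_encoding(4, 8)
out = self_attention(x)  # one contextualized vector per token, shape (4, 8)
```

The key takeaway: attention lets every token “see” every other token (including your ICL examples), while the positional signal preserves the order those examples were written in.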

Using its broad understanding of language patterns, syntax, and semantics (and even reasoning abilities) from the pre-training it received on massive datasets, the LLM can then recognize and generalize patterns from the ICL examples.

As Gemini Advanced also pointed out, ICL is still an active area of research, but there’s one hypothesis that says LLMs implicitly learn a wide range of tasks during pre-training, so presenting the model with examples (ICL) allows it to activate the relevant “subnetworks” within its vast parameter space to perform the specific task.

But, as mentioned, how your prompt is crafted plays an important role in guiding the model’s behavior.

Other potential drawbacks with ICL are inconsistency (especially on complex tasks or ones with limited datasets), bias (which can be inherited from the training data and manifest in ICL behavior), and a lack of explainability (or not knowing why a model predicted what it did).

Still, ICL is a promising area that has the potential to revolutionize how we interact with LLMs, especially given larger context windows that allow for more numerous or complex examples.


This week in SEO history: AltaVista retires (2013)

AltaVista was a pioneering search engine created by researchers at DEC (Digital Equipment Corporation) that launched publicly on December 15th, 1995, with an initial index of around 16 million webpages.

AltaVista search engine circa 1996.
View on WayBack Machine.

AltaVista’s multi-threaded crawler named Scooter could perform multiple tasks simultaneously, making it much faster at crawling and indexing webpages than other crawlers of the era. 🌪️

📝 We mentioned transformers in the last section. Standard transformer architectures use multi-headed attention, which focuses on different aspects of a complex problem (input sequence) simultaneously, analyzing the problem from each head’s unique perspective and expertise. Thus, parallel processing, where multiple sources aggregate information, applies in both contexts, resulting in a more efficient and holistic approach.

Combined with its powerful 64-bit Alpha processors (explained more in the dropdown), AltaVista was considerably fast. By 1996, it was handling 19 million requests per day (check out the old-school source Perplexity found for that stat), making it one of the most popular search engines.

However, the rise of Google in the early 2000s gradually diminished the popularity of AltaVista (Google Trends chart in the dropdown).

During this week in 2013 (July 8th), the search engine was shut down. 🫡

You can still find AltaVista.com in search results, but it 301 redirects to Yahoo!:

301 redirect of AltaVista to Yahoo!

It still gets a respectable number of organic hits, though, per Ahrefs:

Ahrefs traffic chart for AltaVista.
🚨 ⬇️ 👈 Click to learn more about AltaVista’s history (with an interesting and speculative Google Trends finding) and how it was a pioneer in the search space (if you want 😎).

AltaVista changed ownership hands a few times. 🤝

In 1998, it was acquired (along with DEC) by Compaq (the first brand of desktop computer my family bought, at a Costco in Helena, Montana). In 1999, 83% of AltaVista was acquired by CMGI. Then in 2003, Overture purchased the search engine for $140 million. Later that year (2003), Yahoo! acquired Overture (and AltaVista).

With Google’s rise in the early 2000s, AltaVista’s use waned (but Virginia seemed to really like it, for some reason … ⬟ 🕵️ 🤷):

AltaVista Google Trends data all time.

AltaVista was a search pioneer in several ways. 🤠

It was the first full-text database: unlike earlier search tools that referenced specific directories or categories (information like titles and summaries), AltaVista indexed the complete text of webpages (and organized them in a structured database). That enabled users to search for info buried deep within pages, a major breakthrough that made it easier to explore the web.

It also had a simple and clean interface, with a single search bar, making it more intuitive, especially for the era.

Additionally, AltaVista used 64-bit Alpha processors (a DEC creation) that were cutting-edge at the time and gave the search engine a significant speed advantage for processing queries and larger volumes of searches, making it more scalable.

📝 Recall that in Hamsterdam Part 64’s history section, we featured the N64 game console, released in 1996 and so named for its 64-bit CPU and GPU (though it mostly ran on 32-bit data — blame marketing). GPUs also power many of today’s machine learning models, though the architectures are now vastly more powerful. Still, the concept of using specialized hardware to accelerate complex computations is central to the advancements demonstrated by AltaVista, the N64, and today’s ML models.

Speaking of ML today, AltaVista was also the first search engine to allow natural language queries.

It also introduced multilingual searches and was among the first to launch image, video, and audio search capabilities.

Truth be told, I never used AltaVista (that I can recall), but looking back, we can see how its innovations in the ’90s shaped where search would later evolve. 🤗


🤙 Introduction to week 66: Understanding

I had two experiences this week that impacted me.

The first was on Thursday night, when I watched a 2016 comedy documentary called “Jeff Ross Roasts Cops.” 🎤 👮

Now, these recaps are called “Hamsterdam” partly as an homage to The Wire.

What I love about The Wire are the representations: police, gangsters, politicians, teachers, elites, blue collar workers, people on the street, families of all styles, and on and on.

What they all have in common is their humanity and surroundings.

One of my favorite lines from the series (S4.E9) is:

Wherever you go, there you are, from The Wire.
Source

I always knew there was a reason I identified with that line, and I just realized what it is.

Last week, I wrote a post about on-page SEO that referenced mindfulness meditation (which I practice).

It also referenced Jon Kabat-Zinn, an influencer (😉) in the space, and someone whose work the person who introduced me to mindfulness (a therapist and former pro golfer 🏌️) respected greatly.

As it turns out, JKZ wrote a book of the same title as that quote from The Wire:

Wherever You Go There You Are Jon Kabat-Zinn.
Source

The overarching theme, as I’d explain it, is that, while our circumstances may change, we still have to confront ourselves (thoughts and feelings), omnipresent aspects of human nature, and other things that’ll always be beyond our control.

Early in that Jeff Ross comedy documentary, for example, he performs for a squad room full of officers, and they’re stern faced. No laughs.

The reason, aside from his bad jokes (JK 😆), is that they’d seen a photo of him at a BLM protest just before he’d arrived, and they felt a certain type of way.

He broke through slowly and later did a great show at a police benefit, because Jeff Ross is a goat of a human 🐐, but it made me think about how, regardless of our intentions, not everyone is going to get something at first, or maybe at all.

Which leads to the second experience the following day (Friday), when I was walking and listening to Eminem’s new album, The Death of Slim Shady (Coup De Grâce).

Two minutes in, and I was back in elementary school, bumping The Slim Shady LP from a bootleg tape made on a friend’s boombox 📻 (because my parents wouldn’t buy it 😂).

To give you an idea, there’s a “Breaking News” skit on the album with a line that goes, “Eminem has released an album in which he is actually trying to cancel himself.” (Update: Didn’t work. 😉)

Some will love the album. Others won’t, either because they won’t understand it, or because they will but dislike it all the same.

And that’s how it should be. 👀

As Mike Tyson once said:

Mike Tyson quote about how a person who is a friend to everyone is an enemy to themself.
Source

Of course, he said “man,” but it’s universal advice 🌏, which even applies to brands, I’d argue.

What does this have to do with SEO, though? 🧐

Well, our field evolves constantly. We have information flying all around us at any given moment — new facts, findings, or opinions.

It’s a big part of what makes SEO fun, imo 🤗, but it can also feel like a barrage and get overwhelming, especially while trying to balance client or company work, personal learning, professional growth, and life beyond it all. 🥵

It’s not only SEOs who can feel this way, though.

AI is evolving so rapidly that many fields are feeling the pressure to stay up on it all. (Try counting how many new ML papers get published in an average week. 😅)

👉 I’ve found the easiest way to navigate everything, for me, is by picking a lane, following a routine, including a choice group of resources or support system, and staying true to core principles. Of course, being adaptable counts for a lot, too. 🙌

As for understanding, there’ll be aspects of what we do that get misunderstood by some, either temporarily or permanently. But the ones who get it will be the people who help establish our foundations, who literally “rock” with us, and that’s the solid ground from which we can build something cool that’s ours to admire, celebrate, and share.

Celebrate understanding, just don’t sweat it if it doesn’t come 100% of the time.

That’s how it should be. 🔥

Buckle up for a full week’s recap, and enjoy the vibes (Black Sabbath, but starting with a side that’ll maybe surprise those unfamiliar with their deeper catalog 🤘):

YouTube comment about Ozzy singing Solitude.
Comment on Solitude video.

Thank you for supporting Hamsterdam and the cause of SEO & AI learning. 🙏

Missed last week? Don’t worry, I got you! Read Part 65 to catch up.

🌟 Other great sources of weekly SEO news:


Time for our weekly review of SEO social posts, articles, & more …

⚡️ Quick summary:

Jump to a section:

Or keep scrolling to see it all. ⏬

Now, let’s step inside the white flags of Hamsterdam …

The Wire Hamsterdam screenshot for setting up inside the white flags.

📰 SEO news, Google updates, SERP tests & notable posts

Notable updates or news related to Google Search or related SEO topics (aka, the Barry section).

Excerpt: “Google is now showing the translated search results feature in more languages, adding [mystery, gotta read the story 😂] to the list of now 21 languages.”
Excerpt: “Shipping speed, cost, and return policy are major factors considered by shoppers when buying products online. … We’re happy to announce an easier way to add your shipping or return information directly in Search Console. … Note: Search Console shipping and return settings take precedence over configuration on your website, including product-level merchant listing markup. Learn more about what takes precedence.”

🍟 SEO tips & tidbits

Actionable tips, cool tidbits, and other snackable findings and observations that can be teaching moments.

Nice find, Olaff! 💐

Paper excerpt: “We study the performance of both fine-tuned LMs, and in-context few-shot LLMs … We fine-tune different sizes of T5 models … We also fine-tune FLAN instruction-tuned T5 models … We evaluate variants of few-shot PaLM2 … We construct a dynamic prompt that collects examples for each test question 𝑞𝑖 by retrieving the most similar questions from the QuoteSum training set according to Sentence-T5 embedding cosine similarity (QSum). We also experiment with the ALCE prompt that uses fully-abstractive answers with in-line citations …” (arXiv link. Contributors: Tal Schuster, Adam D. Lelkes, Haitian Sun, Jai Gupta, Jonathan Berant, William W. Cohen, Donald Metzler)

Why it matters: SEMQA is a new task that existing LLMs can perform (the models (T5, PaLM2, etc.) and methods mentioned above include fine-tuning and in-context learning, like our AI vocab this week 🙌). Instead of just summarizing information or copying it word-for-word, SEMQA combines both approaches, extracting factual statements (quotes) directly from the original sources and combining them with AI-generated words and phrases. 👉 A version of this is kind of what I was hinting Perplexity should do by including quotes from their interviews. AI-generated text needs human intervention once in a while, imo. 😇

Here’s an example of SEMQA from the paper (Figure 1):

Figure 1 from SEMQA paper.
Note: “Creators” stuck out to me, too, when reading the article. I saw it as Google aligning with social media terminology (“content creators”), at least in part. That’d be forward thinking from a cultural POV, imo, not to mention how much social content appears in Search already, and is perhaps a precursor of more to come.

🦕 SEO (and AI) fundamentals & resources

Essential information, concepts, or resources to learn about SEO or AI.

Excerpt: “Information gain refers to a score that indicates the additional information included in a document beyond the information contained in documents previously viewed by a user. … Techniques involve data from documents being applied across a machine learning model to generate an information gain score, assisting in presenting documents to the user in a manner that prioritizes those with higher scores of new information.”

📚 Articles, videos, case studies & more

Longer-form content pieces shared on social, in newsletters, and elsewhere.

Critical SERP Features of Google’s shopping marketplace – Kevin Indig, Growth Memo

Critical SERP Features of Google’s shopping marketplace
Excerpt: “Yes, AIOs are impactful … However, Google shows a whole slew of SERP Features and AI Features for e-commerce queries that are at least as impactful as AIOs. To better understand the key trends for shopping queries, I analyzed 35,305 keywords across categories like fashion, beds, plants and automotive in the US over the last five months using seoClarity.”
Excerpt: “Identifying the role of ranking factors and their importance for a SERP is tricky because different ranking factors may not correspond to rankings in a linear or consistently increasing/decreasing way.”
Excerpt: “Reviewing the SERPs to understand why Google is ranking something is a good idea. But reviewing it with just a handful of dimensions, a limited amount of ‘signals’ can be frustrating and counterproductive.”
Excerpt: “After examining the patent underlying AI Overviews, I’ve identified several reasons why the system might cite such low or unranked sources. This article aims to explore these explanations in detail.”

I Disavowed Every Link To My Website. Here’s What Happened – Cyrus Shepard, Zyppy

I Disavowed Every Link To My Website. Here’s What Happened
Excerpt: “Nearly two months ago, I disavowed every link to this website listed in Google Search Console and every link listed by Ahrefs. Trust me, it was a lot of good links.”
This was an entertaining read! Should I nofollow the link above? (JK. 😆)

Excerpt: “Every influencer and their mother has mentioned how waking up at 4 am has changed their lives. I follow Alex Hormozi a LOT, and he talks about this thing almost every time. The basic idea that I got from him is that you get a 4-hour block that you should use to do your own work before doing any other work. … I started this on 18th June btw and it lasted 4 days.”

Excerpt: “For those wondering, we are still 90% down, I have unfortunately had to get rid of my entire team, bar one, who is my editor-in-chief and one of my good friends. We’re hanging on for dear life, and still losing money to keep Retro Dodo afloat. I have spent all of my businesses money on keeping my team onboard for as long as I can, but 10 months of loss has eradicated any profits I have made over the lat 5 years. It’s not a fun time, especially as I have just become a father.”
Note: I really enjoyed hearing Zoe’s expertise and perspective in this episode.

🧑‍💻 Technical SEO

Everything from basics to advanced moves (and also tools).

✍️ Content marketing

From what is helpful content to user journeys and beyond.

I accidentally got featured in The Guardian (here’s what you can learn from it) – Mark Rofe, The Digital PR Newsletter #26 (but using LinkedIn version)

I accidentally got featured in The Guardian (here's what you can learn from it)
Excerpt: “After making only 1 sale, I abandoned the website some time in 2013, but before I did, I had this infographic (these were all the rage back then) made by my friend’s younger brother. … But I was dumbstruck to find it had been featured and linked to in The Guardian, TWICE in the same article.”
This was a great read! Enjoyed the approach.
Gary Illyes LinkedIn post about lava and machine translation.
View Gary’s post on LI.

Perplexity planning revenue sharing program with web publishers next month – Emilia David, VentureBeat

Perplexity planning revenue sharing program with web publishers next month
Excerpt: “AI chatbot Perplexity will begin a revenue-sharing program with web publishers next month, the company announced during VB Transform Thursday. Dmitry Shevelenko, chief business officer at Perplexity, said the program will stem from advertising the company plans to run alongside search queries on the Perplexity platform. … He added that revenue sharing will be based on links and results throughout Perplexity, not just its paid Perplexity Pro offering, but also from any links and ads run on the free Perplexity option. Partners will get a percentage of revenue from every ad run against a result, and if a link from a partner’s website is cited, they are entitled to that cash.”

📍 Local SEO

From Google Business Profiles or reviews and more!

📊 Data analysis & reporting

Showing that what you’re doing is helping.

Excerpt: “Google Search Console is a free website analytics tool Google provides. Google Search Console tracks your website’s performance in search results on Google. As an SEO director, I use Google Search Console daily. I check website performance for content updates and troubleshoot any technical changes. It helps me make informed business decisions about where to dedicate my team.”

5 pro tips in Google Sheets from a veteran finance leader – Eric Refuerzo, Google Workspace

5 pro tips in Google Sheets from a veteran finance leader
Excerpt: “Gemini in Sheets saves me a lot of time, especially with the dedicated side panel as a research partner right where I’m already working. It can connect me to external data sources, allowing me to pull in reliable information without ever leaving my spreadsheet. … For richer visualization, Google Sheets and Data Studio allow me to create dynamic reports that update in real-time so that everyone in my team has access to the latest data. Appsheet is another great tool that I’ve recently experimented with, and I’ve seen colleagues use it to create custom apps to view visual outputs of production or status tracking.”

🤖 AI, machine learning, & LLMs

News related to models, papers, and companies.

Darwin Santos LinkedIn post about interactive Claude Artifacts features from data.
Visit Darwin’s post on LI.

Adaptive In-Context Learning 🤝 Vespa – part one – Jo Kristian Bergum, Vespa.ai

Adaptive In-Context Learning by Vespa - part one
Excerpt: “At inference time, we want to identify the labeled training examples with the most significant impact on the accuracy of the downstream task. This process can be formulated as an information retrieval problem where the unlabeled input example is the query, and the labeled training example is our corpus. Instead of a fixed static set of examples always added to the prompt, we move to context-sensitive or adaptive examples that LLM prompt.”
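The retrieval step Jo describes can be sketched very simply. This is not Vespa’s actual implementation, just a toy illustration of the IR formulation: treat the unlabeled input as a query, the labeled training examples as the corpus, and pick the nearest neighbors by embedding cosine similarity (the random vectors below stand in for a real text-embedding model):

```python
import numpy as np

def top_k_examples(query_vec, example_vecs, k=2):
    """Return indices of the k labeled examples most similar to the
    query, by cosine similarity (normalize, then dot product)."""
    q = query_vec / np.linalg.norm(query_vec)
    e = example_vecs / np.linalg.norm(example_vecs, axis=1, keepdims=True)
    sims = e @ q                   # cosine similarity per example
    return np.argsort(-sims)[:k]   # highest similarity first

rng = np.random.default_rng(42)
corpus = rng.normal(size=(10, 16))               # 10 labeled training "examples"
query = corpus[3] + 0.01 * rng.normal(size=16)   # a query near example 3
picked = top_k_examples(query, corpus)           # example 3 should rank first
```

The selected examples would then be formatted into the instruction prompt, giving you context-sensitive (adaptive) ICL instead of a fixed static example set.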

Google DeepMind Introduces a Parameter-Efficient Expert Retrieval Mechanism that Leverages the Product Key Technique for Sparse Retrieval from a Million Tiny Experts – Aswin Ak, MarkTech Post

Google DeepMind Introduces a Parameter-Efficient Expert Retrieval Mechanism that Leverages the Product Key Technique for Sparse Retrieval from a Million Tiny Experts
Excerpt: “Despite the promise of MoEs, as demonstrated by researchers like Shazeer et al. (2017) and Lepikhin et al. (2020), these models face computational and optimization challenges when scaling beyond a small number of experts. … The Researchers from Google DeepMind propose a novel approach called Parameter Efficient Expert Retrieval (PEER), which specifically addresses the limitations of existing MoE models. PEER leverages the product key technique for sparse retrieval from a vast pool of tiny experts, numbering over a million. This approach enhances the granularity of MoE models, resulting in a better performance-compute trade-off. … The findings suggest that PEER can effectively scale to handle extensive and continuous data streams, making it a promising solution for lifelong learning and other demanding AI applications.”

Why it matters: PEER is a new architecture for MoE models (which enhance LLM capabilities, such as in Gemini 1.5) that allows for efficient scaling to a massive number of experts. This could lead to more powerful and accessible AI models that could adapt continuously through lifelong learning. In theory, such advancements to LLMs could also lead to more sophisticated search algorithms. (View paper on arXiv. Contributors: Xu Owen He)

Microsoft Drops From OpenAI’s Board – Perplexity Team

Microsoft Drops From OpenAI's Board on Perplexity
Excerpt: “The departure of Microsoft from OpenAI’s board has prompted a strategic shift in the AI company’s approach to partner engagement. OpenAI plans to establish a new system of regular stakeholder meetings involving key partners like Microsoft and Apple, as well as investors such as Thrive Capital and Khosla Ventures. … Following Microsoft’s lead, Apple also decided not to take up an observer position on OpenAI’s board. … The move by both tech giants to distance themselves from direct board involvement in OpenAI reflects a broader trend in the industry, as companies seek to mitigate potential antitrust concerns and maintain their independence in the rapidly evolving AI landscape.”
Excerpt: “At a very high level, the input to the GraphRAG pipeline are source documents containing various information. The documents are processed using an LLM to extract structured information about entities appearing in the papers along with their relationships. This extracted structured information is then used to construct a knowledge graph. The advantage of using a knowledge graph data representation is that it can quickly and straightforwardly combine information from multiple documents or data sources about particular entities. … After the knowledge graph has been constructed, they use a combination of graph algorithms and LLM prompting to generate natural language summaries of communities of entities found in the knowledge graph. These summaries then contain condensed information spreading across multiple data sources and documents for particular entities and communities.”
Excerpt: “Google’s local LLM Gemini Nano, which runs completely in the browser, is available in recent versions of Chrome Canary. Gradio is a Python package that allows you to create Web UIs upon your models in a few lines of Python code, and Gradio-Lite is its in-browser version that also runs completely in the browser. With a combination of these two, you can create a local LLM-based web app only with Python.”
Excerpt: “Participants in our experiments provided detailed, open-ended explanations of a conspiracy theory they believed, and then engaged in a 3 round dialogue with a frontier generative AI model (GPT-4 Turbo) which was instructed to reduce each participant’s belief in their conspiracy theory (or discuss a banal topic in a control condition). Across two experiments, we find robust evidence that the debunking conversation with the AI reduced belief in conspiracy theories by roughly 20%.” (Paper link on osf.io.)

🤔 General marketing & miscellaneous

This is for great content that isn’t necessarily SEO or marketing-specific. PPC, PR, dev, design, and social friends, check it out!

Progressive Web Apps (PWA): A Comprehensive Guide – Kasie Udoka Success, Dev.to

Progressive Web Apps (PWA): A Comprehensive Guide
Excerpt: “A Progressive Web App (PWA) is a type of web application that combines the best features of traditional websites and native mobile apps. PWAs are designed to be fast, reliable, and engaging, providing a native app-like experience on the web. … PWAs are discoverable by search engines, improving visibility and reach. … Many big companies such as YouTube, Facebook, and even Dev.to made their web apps progressive (installable).”

17 Libraries to Become a React Wizard 🧙‍♂️🔮✨ – Anmol Baranwal, CopilotKit, Dev.to

17 Libraries to Become a React Wizard
Excerpt: “A lot of developers work on a lot of data these days (mostly using APIs). So, a method to easily visualize that data is a cool concept that can take the app to the next level. Victory is an ecosystem of composable React components for building interactive data visualizations.”

💎 Older stuff that’s good!

Not everything I find worth sharing is new as of this week, so these are gems I came across that were published in the past.

All Killer No Filler: Metrics That Matter in Local SEO [Video] – Jessie Low, Kick Point Playbook

All Killer No Filler: Metrics That Matter in Local SEO [Video]
Excerpt: “Remember, when reporting on any metrics you want to ensure that you are tailoring your reports to best reflect what clients and stakeholders care about. Don’t overload them with metrics that they can’t use and are meaningless to them. When you start to report on every single possible metric you open yourself up to questions about why a metric is moving up or down, and if you can’t provide an answer to your client, you will look silly. It can also impact your trustworthiness and their confidence in your expertise.”

Total noob’s intro to Hugging Face Transformers – Andrew Jardine, Hugging Face

Total noob’s intro to Hugging Face Transformers
Excerpt: “Our goal is to demystify what Hugging Face Transformers is and how it works, not to turn you into a machine learning practitioner, but to enable better understanding of and collaboration with those who are. … Hugging Face Transformers is an open-source Python library that provides access to thousands of pre-trained Transformers models for natural language processing (NLP), computer vision, audio tasks, and more. It simplifies the process of implementing Transformer models by abstracting away the complexity of training or deploying models in lower level ML frameworks like PyTorch, TensorFlow and JAX.”

Streamlining Embedding with LangChain: Harnessing Hugging Face Without Downloads – Charan H U, Medium

Streamlining Embedding with LangChain: Harnessing Hugging Face Without Downloads
Excerpt: “LangChain serves as a bridge between developers and powerful NLP resources, abstracting away complexities and streamlining the integration process. One of its standout features is the seamless integration with Hugging Face’s Inference API, enabling users to tap into a treasure trove of embedding models without the need for local installations or downloads.”
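Once an embedding endpoint (such as the Hugging Face Inference API via LangChain, as described above) returns vectors, the typical next step is ranking documents by cosine similarity to a query. A minimal sketch of that step in plain Python, using toy vectors rather than real model output (the document names and values are invented for illustration):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend these vectors came back from an embedding model.
query_vec = [0.9, 0.1, 0.0]
doc_vecs = {
    "doc_about_llms": [0.8, 0.2, 0.1],
    "doc_about_cooking": [0.0, 0.1, 0.9],
}

# Rank documents by similarity to the query, most similar first.
ranked = sorted(doc_vecs, key=lambda d: cosine(query_vec, doc_vecs[d]), reverse=True)
```

In practice a vector store handles this ranking at scale, but the core operation is the same similarity comparison.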

Great job making it to the end. You rock! 🪨

🤝 Want help with your SEO strategy?

I’m an independent SEO consultant focusing on custom audits and holistic strategies for brands. Don’t hesitate to reach out, or visit my about page for more information.

Let’s connect!

Hit me up anytime via text or call at 813-557-9745 or on social or email:

Cheers! ✌️

Editorial history:

Created by Ethan Lazuk on:

Last updated:

