🐹 Hamsterdam part six. (10/27 to 11/3, 2024)
A weekly marketing and AI content recap.

Welcome to the sixth week of the renewed Hamsterdam! 🐹
The focus here is on finds from my Google Discover, X, and TikTok feeds 🤳 🌏, with attention paid to stories you might not see in your ordinary travels through SEO or marketing circles.
Before we dig in, here’s a quick introduction …
With some thoughts about the future of SEO.
Yesterday I had the pleasure of being introduced to ChatGPT search.
I think a lot of attention has been paid to its local results, and rightly so, given the usefulness of its maps.
That said, I had a bit of success using it for commercial queries, shopping for some new shoes.
As with Perplexity, I think the sources cited in ChatGPT search serve more to build credibility than to drive referral traffic.
I also just read that ChatGPT may create dedicated webpages for user queries:
This wouldn’t necessarily change how I generate content for clients, but it does further emphasize that business metrics will be important.
We’ll most assuredly see clicks decline and rankings fluctuate, but how does overall visibility in AI surfaces contribute to engagement (qualified clicks) and conversions?
Now, on to the show! 🎤
Let’s review this week’s choice of stories from Google Discover (followed by X and TikTok).
Credit for all excerpts goes to the authors. Bolding is mine.
1. EXCLUSIVE: Perplexity Is Quietly Building an AI-Powered Shopping Experience, Taking On Amazon – Ad Week

Excerpt:
“Perplexity is now inviting users to gain early access to its new feature, dubbed “Pro Shop,” which allows them to research and purchase products from various merchants directly on its platform, according to an internal email obtained by ADWEEK. …
While browsing for products, users will notice a “Buy with Pro” option for eligible items. After selecting it, they will need to enter their billing and shipping details to finalize their purchase.”
2. Beyond the funnel: A new approach to content marketing – MarTech

Excerpt:
“Far too many content marketing programs fail to connect the brand to customer problems before the customer starts looking for a solution. At this crucial and overlooked point in the customer journey, most brands provide content about the product and the industry, when they should address specific customer pain points. …
Instead of the buyer’s journey, marketers must focus on establishing mental availability: Having their brand come to mind when a buyer encounters a problem. Doing this requires using category entry points and having content that associates your brand with the specific problems it solves. …
“How do you map content to this? How do you actually go about creating a content section in your overall program that helps you address these category entry points?” Moroney asked.
By focusing on problem-centric content and thinking about content not just on the funnel level, but in terms of what it is trying to communicate to the audience. Fortunately, because your product is the solution, you already know what the problems are.”
3. Decoding Google’s “Attention Is All You Need” Paper and NLP’s Evolution – Times Pro

Excerpt:
“The transformer neural network is a novel architecture that aims to solve sequence-to-sequence tasks while handling long-range dependencies with ease. It was first proposed in the paper “Attention Is All You Need.” It is now a state-of-the-art technique in NLP. The key innovation of Transformer is its ability to capture long-range dependencies and contextual information in sequences more effectively. …
At the heart of the Transformer lies the concept of attention—a mechanism empowering neural networks to focus on specific input data segments. This capability is particularly crucial in NLP tasks like machine translation, where attention enables the model to prioritise relevant elements within the source sentence, facilitating accurate translations in the target language. …
Unlike traditional models that process data sequentially, the Transformer adopts a concurrent processing paradigm, making it faster and more efficient. At the core of this innovation lies its adept utilisation of attention mechanisms, empowering the decoder to dynamically attend to the encoder’s hidden states. This enables the model to grasp long-range dependencies within the input text, enhancing performance across a spectrum of NLP tasks. …
The “Attention Is All You Need” paper represents a pivotal moment in the annals of machine learning and NLP, introducing a more efficient and potent model architecture. Its legacy extends beyond academia, shaping the development of advanced models like BERT and GPT-3, which now find application across diverse domains.”
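For readers curious what the attention mechanism in that excerpt actually computes, here’s a minimal sketch of scaled dot-product attention, the core operation from the paper, written in plain NumPy. This is an illustrative toy (single head, random toy inputs), not the full multi-head Transformer:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core operation from "Attention Is All You Need":
    softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # How strongly each query "attends" to each key, scaled for stability
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output is a weighted blend of the value vectors
    return weights @ V

# Toy example: 3 tokens, embedding size 4 (random data for illustration)
rng = np.random.default_rng(0)
Q = K = V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one blended 4-dim vector per token: (3, 4)
```

Because every token’s scores are computed in one matrix multiply rather than step by step, this is the “concurrent processing” the excerpt contrasts with sequential models.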
Now let’s review this week’s X content.
Finally, let’s review this week’s choice of TikTok content.
PSA: scroll slowly on mobile, to avoid timing out.
Thanks for checking out the new Hamsterdam! 🐹
Until next time, enjoy the vibes:
Thanks for reading. Happy marketing! 🤗