🏛️ Official Updates

Bringing the power of Personal Intelligence to more people

Google has expanded its Personal Intelligence feature across AI Mode in Search, the Gemini app, and Gemini in Chrome in the U.S.

This allows users to connect their Google apps like Gmail and Photos to get tailored responses, such as shopping recommendations based on past purchases or custom travel itineraries. Users maintain full control over which apps are connected and can toggle these connections on or off at any time. The feature is available for free-tier users and is designed with privacy and user choice in mind.

🔗 Google The Keyword


🤖 GEO·SEO Highlights

Keyword Intent: What It Is and How to Use It in Your SEO Strategy

Keyword intent is the filter you apply during keyword research to decide whether a keyword belongs in your SEO strategy at all.

I recommend focusing on four main types: informational, commercial, transactional, and navigational. Informational keywords like “how to grow tomatoes” work well when traffic potential is high and you can naturally introduce your product. Commercial keywords such as “best garden hose” offer strategic value for ecommerce sites when your products align with the searcher’s needs. Transactional keywords like “buy garden hose online” convert well but face intense competition. Navigational keywords matter only for your own brand terms. I also find two often-missed types critical: local intent keywords trigger map results rather than organic listings, requiring local SEO tactics, while branded intent keywords include specific brand names and demand strong brand presence. By filtering keywords through intent before creating content, you ensure every piece serves a clear business purpose rather than chasing volume alone.
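
As a rough illustration of that filtering step (my own sketch, not from the article; the pattern lists are hypothetical and would need tuning per niche and language), a few lines of Python can bucket a keyword list by intent before deeper research:

```python
import re

# Hypothetical pattern lists; tune these to your own niche and language.
INTENT_PATTERNS = {
    "informational": r"\b(how|what|why|guide|tutorial|ideas)\b",
    "commercial":    r"\b(best|top|review|vs|compare|alternatives)\b",
    "transactional": r"\b(buy|price|cheap|discount|coupon|order)\b",
    "local":         r"\b(near me|nearby|open now)\b",
}

def classify_intent(keyword: str, brand_terms: set[str]) -> str:
    """Bucket a keyword by the first intent pattern it matches."""
    kw = keyword.lower()
    if any(brand in kw for brand in brand_terms):
        return "navigational/branded"
    for intent, pattern in INTENT_PATTERNS.items():
        if re.search(pattern, kw):
            return intent
    return "unclassified"  # send these for manual review

for kw in ["how to grow tomatoes", "best garden hose", "buy garden hose online"]:
    print(kw, "->", classify_intent(kw, {"acme"}))
```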

🔗 Ahrefs Blog


How to Rank on ChatGPT: What Actually Works (Based on Data)

To rank on ChatGPT, you need to focus on YouTube mentions and visibility, as data shows YouTube has the strongest correlation with AI brand mentions.

I analyzed 75,000 brands and found that because YouTube content is used to train LLMs, and AI assistants can’t serve video directly, YouTube mentions are crucial for increasing your brand’s visibility in ChatGPT responses. The most effective approach is to either create your own YouTube content optimized for search or partner with relevant creators through sponsorship deals. You can track your YouTube mentions using tools like Brand Radar to monitor both tagged and untagged mentions across channels, titles, descriptions, and transcripts. Based on our data, brands with higher YouTube mention counts see significantly better visibility in AI responses, making YouTube optimization your primary strategy for ranking on ChatGPT.
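
Brand Radar is Ahrefs’ own tool, but as a rough do-it-yourself approximation (my sketch, not theirs; the API key and brand name are placeholders, and the YouTube Data API only exposes public metadata here, not transcripts), you could count untagged mentions in video titles and descriptions:

```python
from googleapiclient.discovery import build  # pip install google-api-python-client

API_KEY = "YOUR_YOUTUBE_DATA_API_KEY"  # placeholder credential
BRAND = "acme widgets"                 # hypothetical brand

# search().list with part="snippet" is a standard YouTube Data API v3 call.
youtube = build("youtube", "v3", developerKey=API_KEY)
response = youtube.search().list(
    q=BRAND, part="snippet", type="video", maxResults=50
).execute()

mentions = 0
for item in response.get("items", []):
    snippet = item["snippet"]
    text = (snippet["title"] + " " + snippet["description"]).lower()
    if BRAND in text:
        mentions += 1
        print(snippet["channelTitle"], "-", snippet["title"])

print(f"{mentions} of {len(response.get('items', []))} results mention the brand")
```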

🔗 Ahrefs Blog


What Is Content Decay? (And How to Fix It Before It Tanks Your Traffic)

Content decay is the gradual decline in a page’s organic traffic and rankings over time, and it’s killing your best-performing articles without you noticing.

When you spot content decay, you need to refresh outdated information, update statistics, and improve content quality to regain your search visibility. I’ll show you exactly how to find decaying content using Ahrefs, decide what to do about it, and execute the fix before it tanks your traffic completely. The key is catching content decay early and having a systematic process to address it, because once your rankings slip, competitors will take your place and it becomes much harder to recover.
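
The article leans on Ahrefs for detection, but the same early-warning check works on any monthly traffic export. A minimal sketch, assuming a CSV with page, month, and organic_traffic columns (my column names and threshold, not the article’s):

```python
import pandas as pd

# Hypothetical export with columns: page, month (YYYY-MM), organic_traffic
df = pd.read_csv("organic_traffic_by_page.csv")
df["month"] = pd.to_datetime(df["month"])

# Average traffic over the last three months vs. each page's historical peak.
recent = df[df["month"] >= df["month"].max() - pd.DateOffset(months=3)]
recent_avg = recent.groupby("page")["organic_traffic"].mean()
peak = df.groupby("page")["organic_traffic"].max()

# Pages now earning under half of their peak are refresh candidates.
ratio = recent_avg.reindex(peak.index, fill_value=0) / peak
print(ratio[ratio < 0.5].sort_values().head(20))
```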

🔗 Ahrefs Blog


Digital PR Strategy in 3 Simple Steps

In this episode of Whiteboard Friday, Chloe Osunsami provides a digital PR strategy in 3 simple steps to help you analyze competitors, find AI visibility gaps, and secure authoritative brand mentions in 2026.

I’m Chloe, Head of Digital PR at Aira, and today I’m going to take you through how to set up a solid digital PR strategy for 2026 using 3 simple steps. As many of you will be aware, digital PR has come back into the spotlight over the last 18 months or so, thanks to the evolution of search and the rise of GEO. Alongside a strong technical foundation and fantastic content, digital PR is needed to increase visibility within these spaces, and that’s because digital PR helps to build relevant, authoritative brand mentions and links, which serve as trust signals to these systems.

First, we want to start by looking at competitor performance. But how do we know who to dig into? Your target audience is not going to look at just your brand; no brand operates in a vacuum, so you have to think about the other brands your audience might consider. Take, for instance, Hoover. They are so renowned for their vacuum cleaners that their name has become synonymous with the product, but a buyer is still likely to compare them to, say, VAX, Dyson, or Samsung. So even Hoover doesn’t operate in a vacuum.

So if you have a list of competitors, great. If you don’t, there are ways to uncover competitors using SEO tools that surface competing domains. These reports tend to be based on the share of common keywords, so I would delve into each site a little and double-check that the product or service is equivalent or very similar to what you offer. I would also look at challenger brands: ones that may not be as big right now, but are trying to make waves within the industry.

Now, once you have your competitors, you want to start digging into their performance, so you can answer the question, “Who are the ones to watch?” To do that, you can look into some or all of these metrics, depending on what you want to delve into. I would highly recommend looking at organic traffic to see who is performing best in the organic space. Domain Authority is also a good measure of the quality of their backlink profile. You can look at the keywords and the specific areas competitors are focusing on, the ones they’re performing best in, and how many of their keywords are in top positions. Finally, look at the referring domain profile, both for its quality and its relevancy, as well as AI visibility.

Now, when looking at the gaps, you want to start by looking at coverage and AI visibility. That means links and brand mentions, as well as prompt or cluster gaps in the AI space where your competitors are being mentioned or cited and you’re not. For links, a link intersect is a very helpful way to find sites that link to your competitors but not yet to you. For brand mentions, there are tools that can help uncover them, but you can also use Google search operators to uncover coverage of your competitors.

Now, once you have those gaps, the high-quality, relevant ones become your targets, and you want to dig into those mentions and citations to understand how competitors earned them. You could be asking questions about the type of content they produced. Maybe it’s informational and super helpful. It might not be something you cover, but it’s always worth suggesting to other teams, because that will also impact your work; we’re all working towards the same goals at the end of the day. You also want to think about whether they are using specific insights within their digital PR strategy to secure those mentions.
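
As a concrete example of the search operators Chloe mentions, a small helper (my sketch; the competitors and domains come from her Hoover example and are illustrative) can build “coverage outside the brand’s own site” queries:

```python
from urllib.parse import quote_plus

def mention_query(brand: str, own_domain: str, site: str | None = None) -> str:
    """Build a Google search URL for brand coverage outside the brand's own site."""
    q = f'"{brand}" -site:{own_domain}'
    if site:
        q += f" site:{site}"  # optionally restrict to one publication
    return "https://www.google.com/search?q=" + quote_plus(q)

# Competitors from the Hoover example above (domains are assumptions).
for brand, domain in [("Dyson", "dyson.com"), ("VAX", "vax.co.uk")]:
    print(mention_query(brand, domain))
```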

🔗 Moz Blog


AI Search Trends for 2026 & How You Can Adapt to Them

AI search trends are reshaping how people find and consume information online.

I analyzed the key patterns emerging in AI-powered search and found that queries are becoming longer and more complex, multimodal inputs are growing rapidly, and traditional click-through metrics are declining. Google Lens processes over 12 billion visual searches monthly, while Circle to Search queries have tripled in the past year. Younger users under 30 are adopting AI search tools at twice the rate of older adults. The data shows AI systems now synthesize answers from multiple sources rather than just ranking pages, making content structure and clarity more critical than ever. I recommend focusing on creating comprehensive, well-organized content that addresses specific user scenarios and includes proper metadata for images and videos to maintain visibility in this evolving search landscape.

🔗 Semrush Blog


How to Do Keyword Research in 2026 (6 Ways + Framework)

Keyword research remains the foundation of any successful content strategy in 2026.

I use six methods to find relevant terms that drive visibility across both traditional search and AI search. First, I check existing search rankings through Google Search Console and Semrush’s Organic Rankings tool to find keywords I already appear for but haven’t yet optimized around. Second, I study first-party data from sales calls, support tickets, and onboarding questions to uncover real customer language and pain points. Third, I analyze competitor keywords to identify gaps in my own strategy. Fourth, I use keyword research tools to discover new opportunities based on search volume and difficulty. Fifth, I examine related searches and questions from search engines to understand what users want to know. Sixth, I track trending topics in my industry to stay current with emerging search patterns. By combining these approaches, I build a comprehensive keyword strategy that drives real business value through increased brand awareness and revenue-generating actions.
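
For that first method, the Search Console Search Analytics API can pull those queries programmatically. A minimal sketch, assuming OAuth credentials with read access to a verified property (the file name, property URL, and thresholds are my assumptions):

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumes an OAuth token with the webmasters.readonly scope already exists.
creds = Credentials.from_authorized_user_file("token.json")
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # hypothetical verified property
    body={
        "startDate": "2026-01-01",
        "endDate": "2026-03-31",
        "dimensions": ["query"],
        "rowLimit": 100,
    },
).execute()

# Queries with real impressions but a weak average position are the
# "already appearing, not yet optimized" opportunities.
for row in response.get("rows", []):
    if row["position"] > 10 and row["impressions"] > 100:
        print(row["keys"][0], round(row["position"], 1), row["impressions"])
```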

🔗 Semrush Blog


How to Optimize Content for AI Search Engines [2026 Guide]

To optimize content for AI search engines in 2026, focus on structured, authoritative, and timely content that aligns with how AI models evaluate and cite sources.

I recommend targeting question-based queries, using clear H2/H3 headings as questions, and providing concise answers of 40–60 words to match snippet formatting. Incorporate semantic HTML, schema markup, and fresh updates to improve crawlability and relevance. Build topical authority through consistent keyword co-occurrence and credible authorship. These steps increase your chances of being cited in AI Overviews and LLM responses, turning visibility into traffic and customers.
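
To make the schema advice concrete, here is a minimal FAQPage JSON-LD sketch generated with Python (the question and answer are placeholders; FAQPage is just one of several schema.org types this advice can apply to):

```python
import json

faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        # The H2/H3 phrased as a question, per the article's advice.
        "name": "How do AI search engines pick sources to cite?",
        "acceptedAnswer": {
            "@type": "Answer",
            # Keep this in the 40-60 word range the article recommends.
            "text": "Placeholder answer: AI search engines favor pages that "
                    "answer a question directly, use clear headings, and expose "
                    "machine-readable structure such as schema markup.",
        },
    }],
}

# Embed the output in the page head as:
# <script type="application/ld+json"> ... </script>
print(json.dumps(faq, indent=2))
```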

🔗 Semrush Blog


How One SEO Consultant Turns Semrush’s AI Sentiment Insights into Traffic and Visibility

One SEO consultant uses AI sentiment insights to double organic visibility.

Zbyněk Fridrich, a 17-year SEO veteran, leverages Semrush’s AI tools to fix brand perception and grow traffic. His two-phase approach first corrects what AI says about a brand, then finds new content opportunities. For WorkLounge, sentiment scores climbed from 67 to 82 after rewriting 90 pages of content. AI Overview visibility jumped from 17% to 35% in five months. The process includes analyzing sentiment data, fixing technical issues, using AI prompt insights to plan content, and distributing updates across channels at peak demand times. This systematic workflow delivers measurable results in AI search visibility.

🔗 Semrush Blog


Google: 404 Crawling Means Google Is Open To More Of Your Content

Google’s John Mueller confirms that 404 crawling is a positive signal, indicating Google’s interest in more content from your site.

In a recent discussion on Reddit, Mueller clarified that repeated crawling of 404 pages by Googlebot is not a problem and doesn’t require fixing. He emphasized that the 404 status code simply means the page was not found, not that it needs to be repaired. Mueller also noted that switching to a 410 response code won’t change Google’s crawling behavior or Search Console reporting. This insight corrects common misunderstandings about 404 and 410 responses, highlighting that Google’s continued crawling of 404 pages can be seen as a positive sign of potential interest in your site’s content.

🔗 Search Engine Journal


The Content Moat Is Dead. The Context Moat Is What Survives

The content moat is dead. AI can now synthesize any publicly available information, making original data and proprietary insights the only defensible competitive advantage.

AI summarization tools have eliminated the traditional content moat. When models can condense any guide into three sentences, commodity content becomes raw material rather than an asset. The solution is building a context moat through original research, proprietary data, and first-person expertise that AI cannot replicate.

Original benchmarks and case studies with specific metrics create defensible content. When HubSpot publishes its State of Marketing report, AI must cite HubSpot because no other source exists for those numbers. Similarly, publishing your own performance data, customer insights, or testing results forces AI systems to reference your content when making claims in your domain.

The evidence is clear: content with original statistics earns 41% more AI citations than generic information. Data-rich websites receive 4.3 times more citation occurrences per URL than directory-style listings. This is not theoretical—AI systems actively seek out unique, verifiable data points when constructing responses.

Building a context moat means publishing proprietary benchmarks, conducting original research, sharing specific case study results, and offering expert commentary that only someone with your experience can provide. Without these elements, your content becomes invisible to AI systems that now mediate most information discovery.

🔗 Search Engine Journal


Authentic Human Conversation™

Reddit is suing companies for accessing user-generated content it doesn’t own, while selling that same content to AI companies for $130 million annually and billing it as authentic human conversation even though much of the platform’s activity now comes from automated bots.

I need to explain what’s happening here. Reddit’s legal team is arguing in federal court that reading Google search results containing Reddit snippets constitutes copyright infringement. They’re using the same DMCA law meant to stop DVD piracy to claim ownership over conversations users retain rights to.

The numbers tell the story. Google pays $60 million yearly. OpenAI pays an estimated $70 million. Reddit’s user agreement explicitly states users own their content, yet the company wants to charge more as AI models cite Reddit more frequently.

Reddit claims its content is authentic human conversation, but CEO Steve Huffman admitted the platform is “mostly bots now.” The company banned tens of thousands of accounts during its 2026 relaunch, yet still argues it owns conversations between actual humans.

This creates an impossible situation. Reddit cannot simultaneously argue that users own their content, that Reddit controls access to that content, and that reading publicly available search results breaks the law. The company is trying to have it both ways – claiming ownership while licensing content it explicitly doesn’t own.

The core issue is clear: Reddit is monetizing conversations it doesn’t own while restricting access to content its users retain rights to. This isn’t sustainable. Companies paying $130 million annually for “authentic human conversation” should understand exactly what they’re buying – and who actually owns it.

🔗 Search Engine Journal


What Can Log File Data Tell Me That Tools Can’t?

Log file analysis reveals critical crawl patterns and technical issues that SEO tools cannot detect, providing the most accurate record of how bots and users interact with your website.

I use log files to identify which pages search engines actually crawl, spot crawl waste from parameterized URLs, and verify bot authenticity by checking IP ranges against known search engine addresses. Unlike analytics software that shows user behavior, log files tell me exactly what search engine bots see, including temporary outages and crawl budget allocation across site sections. When I migrated a client’s website last month, log files showed me how quickly Googlebot discovered the changes, something no crawling tool could reveal. Log files also helped me find orphan pages that external links brought to Google’s attention but weren’t in our internal link structure, allowing us to fix legacy URLs that were still being crawled. The raw data in log files – timestamps, IP addresses, status codes, and response sizes – gives me concrete evidence of crawl behavior that I cannot get from any other SEO tool.
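
As one concrete slice of that workflow, here is a minimal sketch of the bot-verification step (the log path and combined-log format are assumptions; the reverse-then-forward DNS check is the method Google documents for verifying Googlebot):

```python
import re
import socket

# Combined log format: ip - - [time] "request" status size "referrer" "user-agent"
LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<req>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def is_real_googlebot(ip: str) -> bool:
    """Reverse DNS must resolve to a Google hostname; forward DNS must match."""
    try:
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return socket.gethostbyname(host) == ip  # forward-confirm the IP
    except (socket.herror, socket.gaierror):
        return False

with open("access.log") as f:  # hypothetical log path
    for line in f:
        m = LINE.match(line)
        if m and "Googlebot" in m.group("ua") and is_real_googlebot(m.group("ip")):
            print(m.group("status"), m.group("req"))
```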

🔗 Search Engine Journal


How To Build An SEO Commissioning Workflow: From Tickets To Requirements

To build an SEO commissioning workflow, enterprises must shift from reactive ticket-based SEO to proactive requirement-setting. Instead of fixing pages after launch, SEO should define search requirements before assets are created. This ensures content, templates, and platforms are designed for discoverability from the start.

The traditional model fails because SEO operates too late – after content is written and pages are built. Teams then file tickets that sit in backlogs while structural issues multiply. High-performing organizations have flipped this by making SEO a commissioning function that shapes what gets built.

The commissioning lifecycle has three stages:
1. Define intent before creation – identify how users search and whether the asset deserves to exist
2. Define eligibility signals – specify schema, metadata, structure, and entity associations before development
3. Define structural requirements – work with engineering on URL rules, templates, and navigation

By embedding SEO requirements upstream, organizations stop asking “How do we fix this?” and start asking “What must be true before this should exist?” This transforms SEO from a repair function to a design discipline that ensures digital systems work as intended from the beginning.
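
One way to operationalize stage two (purely illustrative; the field names are mine, not the article’s) is to encode eligibility signals as a structured requirement that travels with the brief instead of landing in a post-launch ticket:

```python
from dataclasses import dataclass, field

@dataclass
class SearchRequirements:
    """Eligibility signals defined before an asset is built (illustrative)."""
    primary_intent: str                       # how users will search for this asset
    schema_types: list[str] = field(default_factory=list)
    required_metadata: list[str] = field(default_factory=list)
    url_pattern: str = ""                     # agreed with engineering up front
    indexable: bool = True

brief = SearchRequirements(
    primary_intent="compare enterprise CRM pricing",
    schema_types=["Product", "FAQPage"],
    required_metadata=["title", "meta description", "canonical"],
    url_pattern="/crm/pricing/{vendor}/",
)
print(brief)
```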

🔗 Search Engine Journal


Search Referral Traffic Down 60% For Small Publishers, Data Shows

Search referral traffic to small publishers dropped 60% over two years, according to Chartbeat data.

That’s nearly three times the decline at large publishers, with mid-sized publishers losing 47% and large publishers losing 22%. I recommend publishers focus on owned channels like email and apps to offset these losses. News sites see the highest total page views from AI chatbot referrals but lowest engagement per article, while how-to content gets more engaged AI traffic. We need to watch for Chartbeat’s full data release to better understand these trends.

🔗 Search Engine Journal


Google AI Overviews Cut Germany’s Top Organic CTR By 59%

Google AI Overviews cut Germany’s top organic click-through rate by 59%, according to a SISTRIX analysis of over 100 million keywords.

The data shows that when AI Overviews appear, position one CTR drops from 27% to 11%, with informational queries hit hardest while transactional searches remain largely unaffected. I recommend that SEO teams now factor AI Overview prevalence into their keyword analysis, as category-level impact varies widely—from 24% traffic loss in parenting content to under 2% in recipes and shopping.
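
A back-of-envelope way to do that factoring, using the article’s position-one CTRs (the queries, volumes, and AI Overview rates below are made up for illustration):

```python
# Position-1 CTRs from the SISTRIX data: 27% without an AI Overview, 11% with one.
CTR_NO_AIO, CTR_AIO = 0.27, 0.11

def expected_clicks(volume: int, aio_rate: float) -> float:
    """Blend the two CTRs by how often an AI Overview shows for the query."""
    return volume * (aio_rate * CTR_AIO + (1 - aio_rate) * CTR_NO_AIO)

# Hypothetical queries at equal volume but different AI Overview prevalence.
for kw, vol, rate in [("parenting query", 5000, 0.80), ("recipe query", 5000, 0.05)]:
    print(kw, round(expected_clicks(vol, rate)))  # ~710 vs ~1310 expected clicks
```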

🔗 Search Engine Journal


What’s Hot, What’s Not: AI Search Changes In Q1 2026 [Recap]

In Q1 2026, AI search changes show what’s hot and what’s not for marketers.

Clicks drop when AI Overviews appear, but branded queries see 18% higher CTR. AI Mode and ChatGPT now sell ads, with OpenAI testing $60 CPM placements. Replaceable content faces AI threat, while original research and firsthand experience drive clicks. Schema markup now trains LLMs across platforms, making structured data more valuable than ever. The key is creating content with depth that AI summaries can’t replace.

🔗 Search Engine Journal


SEO Test Shows It’s Trivial To Rank Misinformation On Google

An SEO test shows how easily misinformation can rank on Google, according to a recent experiment.

A marketer intentionally published false information about a non-existent Google Core Update in March 2026 to track how misinformation spreads online. The fabricated news ranked on Google’s first page for relevant search terms and even appeared in AI Overviews, demonstrating the search engine’s vulnerability to false content. Multiple websites picked up and repeated the misinformation, creating detailed articles with invented technical details. This experiment reveals that Google lacks effective fact-checking mechanisms, allowing misinformation to spread rapidly through search results and AI features. The findings highlight the importance of verifying information before sharing or relying on it, especially in the SEO industry where updates and algorithm changes are frequently discussed.

🔗 Search Engine Journal


The Brand Tax: How Google Profits From Demand You Already Own

The brand tax is real: Google profits from demand you already own by taking credit for conversions that would have happened anyway.

I’ve seen this firsthand in my work with clients who pour money into branded search campaigns, only to discover their highest ROAS numbers are actually capturing existing demand rather than creating new customers. Contentsquare’s analysis of 99 billion sessions shows that while ad costs have risen 30% and conversion rates have fallen 5%, most performance dashboards still show paid search as the top channel because branded search campaigns return 1,299% ROAS versus just 68% for non-branded. This happens because when people hear about your brand through social media, podcasts, or word of mouth, they search your company name on Google, and Google gets attribution credit for the conversion. The more you invest in brand building elsewhere, the better your branded search numbers look, which makes Google look like your best channel and leads to more spending. I recommend separating brand and non-brand campaign data immediately to see the true economics of your acquisition strategy.
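
A minimal sketch of that brand/non-brand split (the column names and brand tokens are my assumptions), so blended ROAS can no longer hide the difference:

```python
import pandas as pd

BRAND_TERMS = ("acme",)  # hypothetical brand tokens

# Hypothetical export with columns: campaign, search_term, cost, conversion_value
df = pd.read_csv("paid_search_terms.csv")
df["segment"] = df["search_term"].str.lower().apply(
    lambda q: "brand" if any(t in q for t in BRAND_TERMS) else "non-brand"
)

summary = df.groupby("segment")[["cost", "conversion_value"]].sum()
summary["roas_pct"] = 100 * summary["conversion_value"] / summary["cost"]
print(summary)  # expect brand ROAS to dwarf non-brand, as the article describes
```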

🔗 Search Engine Journal


You’re Not Scaling Content. You’re Scaling Disappointment

You’re not scaling content. You’re scaling disappointment.

For over a decade, SEO practitioners have chased content volume as a shortcut to rankings, cycling through content spinning, programmatic SEO, and now AI-generated content at scale. Each wave follows the same pattern: mass-produce pages, watch initial gains, then watch traffic collapse when search algorithms catch up. The qualitative wall—content must provide genuine value beyond what already exists—remains unchanged regardless of the tools used. Google’s spam policies explicitly warn against scaled content abuse, yet the industry repeats the same mistakes. Whether using 2008-era spinners or 2024’s most advanced language models, publishing thin, templated pages that lack original insight or expertise guarantees wasted resources and eventual ranking penalties. The tools evolve, but the fundamental truth stays constant: you cannot industrialize quality.

🔗 Search Engine Journal


Rethinking SEO in the age of AI

Rethinking SEO in the age of AI requires understanding how AI systems now act as gatekeepers between users and search engines.

I’ve learned that AI platforms like ChatGPT, Perplexity, and Gemini retrieve and synthesize information before presenting answers, fundamentally changing how people discover content online. Concrete data shows that AI systems typically only see the top five search results per query, making strong search visibility more critical than ever. I recommend focusing on high-quality, distinctive content that stands out, as AI models compress information and prioritize either dominant signals or outliers. The fundamentals of SEO still matter—creating valuable content with clear structure remains essential—but we must now think about visibility across multiple AI-driven environments rather than just Google. This shift means businesses need new tools to track how they appear in AI-generated answers and ensure their content makes it into that crucial retrieval window.

🔗 Yoast SEO Blog


I am Wonfull, an SEO & GEO expert driving next-gen organic growth. I recently scaled a Middle Eastern media project's organic traffic by 10x in 6 months. As an AI builder, I created seo-audit (delivers a 92-point SEO diagnostic report in 1 minute) and am developing GEOWriter to automate content pipelines via agentic workflows.
