🏛️ Official Updates

A new way to explore the web with AI Mode in Chrome

Google’s AI Mode in Chrome introduces a new way to explore the web, transforming how users interact with information online.

By opening webpages side-by-side with AI Mode, we can now access and engage with content more fluidly, without switching tabs. This upgrade allows for deeper exploration and comparison of details while maintaining the context of our search. We can ask follow-up questions in real-time, making the process of discovering and learning from the web more efficient and focused.

🔗 Google The Keyword


🤖 GEO·SEO Highlights

How HubSpot became the #1 CRM in AI search [A case study]

HubSpot became the #1 CRM in AI search by implementing a three-pillar AEO strategy that increased their AI visibility by 1,850% and citations by 433%.

🔗 HubSpot Marketing


Astro SEO Guide – From Yoast SEO Founder

This Astro SEO guide by Yoast SEO founder Joost de Valk provides a comprehensive technical framework for optimizing Astro sites.

The guide covers six core components: a unified SEO component that handles all metadata; auto-generated Open Graph images at 1200×675 resolution; build-time validation that checks H1s, duplicate titles, schema data, image alt text, metadata length, and internal links; structured data implementation with JSON-LD graphs; IndexNow integration for search engine notifications; and keyphrase optimization strategies. The author demonstrates how static HTML on a CDN offers clear SEO advantages over traditional CMS platforms, eliminating theme conflicts and plugin interference while maintaining full control over output.
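The IndexNow integration the guide mentions comes down to a small JSON submission. A minimal Python sketch of the payload, with placeholder host and key values; the field names follow the public IndexNow protocol, and nothing here is taken from the guide itself:

```python
import json

def build_indexnow_payload(host, key, urls):
    """Assemble the JSON body for an IndexNow submission.

    Field names (host, key, urlList) follow the public IndexNow
    protocol; the host and key below are placeholders.
    """
    return {
        "host": host,
        "key": key,
        "urlList": list(urls),
    }

payload = build_indexnow_payload(
    "example.com",
    "0123456789abcdef",  # placeholder verification key
    ["https://example.com/blog/astro-seo/"],
)
body = json.dumps(payload)
```

A build step could POST this body to the IndexNow endpoint after each deploy, so search engines learn about new or changed pages immediately.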

🔗 Hacker News (SEO)


Show HN: TopRank – Open-Source Claude Code Skills for SEO and Google Ads

TopRank is an open-source Claude Code tool that provides data-driven SEO and Google Ads optimization with automated execution capabilities.

This free tool connects directly to Google Search Console and Google Ads, analyzing your traffic, identifying ranking issues, and detecting wasted ad spend. When given repository access, it automatically fixes problems by rewriting meta tags, fixing headings, and adding structured data. The tool answers critical questions like “Am I wasting money on ads?” and “Why did my traffic drop?” with specific insights and actionable recommendations. Installation takes just 30 seconds, making it accessible for immediate use.

🔗 Hacker News (SEO)


How to do a website audit in 2026 (+ free tracker)

I conducted a comprehensive website audit in 2026 that revealed critical issues affecting both search visibility and user experience.

The audit covered technical SEO, AI optimization, content quality, and conversion rate optimization across all major site sections. Using a structured approach with the free Semrush audit tracker, I identified specific problems that needed immediate attention, from crawlability issues preventing proper indexing to conversion bottlenecks hurting revenue. This systematic process helped prioritize fixes that would have the biggest impact on performance metrics.

🔗 Semrush Blog


The Complete AI Research Workflow: From Prompt Discovery to Content Creation

I discovered a complete AI research workflow that transforms how we track and optimize content for AI-driven search. This five-step process, developed by Moz Pro, starts with prompt discovery and ends with continuous optimization across major AI models like ChatGPT, Gemini, and Claude.

The workflow begins with researching conversational prompts that matter to your business using Moz Pro’s Prompt Suggestions tool. This tool analyzes how people naturally discuss your brand in AI interactions and provides organic search metrics alongside suggested prompts. By combining AI visibility data with traditional SEO metrics like search volume and difficulty, you can make informed decisions about which prompts to target.

Next, you track your brand’s presence across AI-generated responses, identifying opportunities to create or update content based on competitor analysis and organic data. The system allows you to monitor progress over time and adjust strategies as AI search continues evolving. This approach recognizes that SEO and generative engine optimization (GEO) overlap significantly—approximately 90% according to industry expert Lily Ray—meaning content optimized for AI search also strengthens organic search performance.

The complete AI research workflow provides a tangible way to compete in the new evolution of search where users increasingly start their journey in large language models rather than traditional search engines.

🔗 Moz Blog


What Is Answer Engine Optimization? And How to Do It

Answer engine optimization (AEO) is the process of optimizing your brand to appear in AI-generated answers across platforms like Google AI Mode and ChatGPT. I recommend focusing on AEO because AI search tools are now generating responses that often replace traditional search results, and brands that appear in these AI answers gain significant visibility and trust with users.

The key difference between AEO and traditional SEO is that AEO targets how AI systems find, evaluate, and present information, rather than just ranking in search engine results pages. While SEO focuses on keywords and backlinks, AEO emphasizes getting positive brand mentions in reputable publications, creating AI-friendly content, and building trust signals that AI systems recognize.

To optimize for answer engines, I suggest gaining brand mentions across trusted sources like news articles, blogs, and industry publications. This helps AI systems recognize your brand as a credible source. Additionally, publish original, useful content that directly answers common questions in your industry, using clear, structured formats that AI can easily parse and cite.
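One common way to give AI systems an easily parseable question-and-answer format is schema.org's FAQPage markup. A hedged Python sketch that assembles such a JSON-LD block; the schema type and example content are my illustration, not something the article prescribes:

```python
import json

def faq_jsonld(pairs):
    """Build a schema.org FAQPage JSON-LD block from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

block = faq_jsonld([
    ("What is AEO?", "Answer engine optimization is the practice of ..."),
])
# Embed the block in a page head as a JSON-LD script tag.
script_tag = f'<script type="application/ld+json">{json.dumps(block)}</script>'
```

The point is structural: each question is a discrete, labeled unit an answer engine can lift out and cite, rather than prose it has to segment itself.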

The business impact is significant – Semrush’s research shows AI search visitors are 4.4 times more valuable than traditional organic search visitors based on conversion rates. By investing in AEO now, you’re positioning your brand to be included in the conversations AI systems have with users, which is becoming increasingly important as more people rely on AI-generated answers rather than clicking through to websites.

🔗 Semrush Blog


Travel Marketing: How to Compete and Future-Proof in 2026

In 2026, travel marketing must evolve to compete in an AI-driven landscape, where digital PR, personalized stories, and human-first narratives are key.

Chloe Osunsami, Head of Digital PR at Aira, highlights that 84% of travelers use AI for trip planning, making brand visibility in AI search crucial. Digital PR is no longer just about links but also brand mentions, which correlate strongly with AI visibility. To stand out, travel brands should focus on personalized itineraries, human-first narratives, an “always on” approach with data-led campaigns, robust data stories, and tailored outreach to journalists. These strategies ensure long-term success in a competitive market.

🔗 Moz Blog


GEO: A Practical Guide

Generative engine optimization (GEO) is essential for brands to appear in AI-powered search results, as AI-generated answers now shape how people discover and choose products.

I recommend focusing on GEO because AI platforms like ChatGPT and Google AI Overviews reach billions of users, influencing purchasing decisions through agentic search and agentic commerce. Unlike traditional SEO that targets search rankings, GEO optimizes for inclusion in AI-generated responses, requiring content that is clear, extractable, and frequently updated. My experience shows that investing in GEO delivers organic visibility without ads, attracts qualified traffic, ensures 24/7 brand presence, and builds industry credibility. To succeed, consistently publish relevant content, make it easily accessible to AI systems, and earn credible mentions across the web—these strategies overlap significantly with effective SEO practices.

🔗 Semrush Blog


Local Keyword Research for SEO: What It Is & How to Do It

Local keyword research is essential for improving your business’s visibility in local search results and Google Maps. This guide shows you how to identify high-intent local keywords that drive both online and foot traffic to your business.

Effective local keyword research helps you understand how customers search for products and services in specific areas, from entire cities down to neighborhoods. By targeting these location-specific terms, you can optimize your website and Google Business Profile to appear in both traditional search results and the valuable Local Pack.

The research process involves identifying different types of local keywords – from city-level service terms to “near me” searches and ZIP code-specific queries. Understanding the difference between implicit and explicit local keywords is crucial, as Google treats these differently based on user location and search intent. This knowledge helps you create content that ranks well for both types of searches while improving your overall local SEO strategy.
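The implicit/explicit distinction can be sketched as a simple check for location terms in the query. This is an illustrative heuristic only; a real list of location terms would come from your own service areas (cities, neighborhoods, ZIP codes):

```python
def classify_local_keyword(keyword, location_terms=("near me", "austin", "78701")):
    """Label a query 'explicit' if it names a location, else 'implicit'.

    Explicit local keywords state the location ("plumber near me",
    "dentist in Austin"); implicit ones rely on Google inferring
    local intent from the searcher's position.
    """
    kw = keyword.lower()
    return "explicit" if any(term in kw for term in location_terms) else "implicit"

classify_local_keyword("plumber near me")    # explicit
classify_local_keyword("emergency plumber")  # implicit
```

Splitting a keyword list this way helps decide which terms need location-specific landing pages and which are better served by Google Business Profile signals.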

🔗 Semrush Blog


How to Create an Effective SEO Report in 2026 (+ Free Template)

To create effective SEO reports in 2026, we need to track both traditional search metrics and AI visibility data.

The article recommends including organic clicks, AI visibility, click-through rates, conversion rates, keyword rankings, backlinks, and site health as foundational metrics. Using Semrush’s free template and automated reporting tools can streamline the process while providing actionable insights for clients and stakeholders. The key is selecting metrics that align with your specific SEO goals and presenting them through clear data visualizations that even non-technical readers can understand.

🔗 Semrush Blog


10 Best Website Traffic Analysis Tools for 2026 (Features & Pricing)

The 10 best website traffic analysis tools for 2026 provide comprehensive insights into site performance and user behavior, helping you optimize your online presence.

I’ve tested these tools extensively, and each offers unique features for tracking metrics like views, bounce rates, and traffic sources. From free options like Google Analytics 4 to premium solutions like Semrush’s Traffic Analytics and Organic Traffic Insights, these tools enable you to identify successful marketing campaigns, improve low-engagement pages, and understand audience demographics. The pricing ranges from free to enterprise-level plans, making them accessible for businesses of all sizes.

🔗 Semrush Blog


The 11 Best Local SEO Tools in 2026

The 11 best local SEO tools in 2026 include Google Business Profile, GBP Optimization, and Listing Management to help businesses dominate local search. I’ve found these tools essential for maintaining visibility across Google Maps, traditional search results, and AI-driven search experiences.

Google Business Profile lets you create and manage your free business listing, which ranks in both Google Search and Maps. When you optimize your profile with complete business information, photos, and customer reviews, you increase your chances of appearing in relevant local searches.

GBP Optimization from Semrush enables you to manage multiple Google Business Profiles from one dashboard, create AI-generated posts, and schedule content updates. This centralized management saves time while maintaining consistent brand messaging across locations.

Listing Management ensures your business information remains consistent across online directories, which builds trust with both search engines and potential customers. The tool automatically distributes accurate NAP (name, address, phone number) data to high-quality directories.
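Consistency checks like the ones a listing tool performs can be approximated by normalizing NAP fields before comparing them. A toy sketch, not any tool's actual logic:

```python
import re

def normalize_nap(name, address, phone):
    """Reduce a NAP record to a comparable canonical form.

    The normalization rules here (case-folding, stripping punctuation,
    digit-only phone numbers) are a simple illustration.
    """
    return (
        " ".join(name.lower().split()),
        " ".join(address.lower().replace(".", "").split()),
        re.sub(r"\D", "", phone),
    )

a = normalize_nap("Acme Plumbing", "12 Main St.", "(512) 555-0100")
b = normalize_nap("ACME Plumbing", "12 main st", "512-555-0100")
a == b  # the two listings agree once normalized
```

Two directory entries that look different on the surface often carry identical NAP data; normalizing first avoids flagging cosmetic differences as inconsistencies.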

These tools work together to help you show up in AI-generated answers, maintain consistent business information, track rankings across Google Search and Maps, and build trust through customer reviews. By implementing these 11 best local SEO tools, you’ll maintain strong visibility as search behavior continues evolving toward AI-driven discovery.

🔗 Semrush Blog


The 6 Agentic AI Protocols Every SEO Needs to Know

The 6 agentic AI protocols every SEO needs to know are reshaping how AI agents discover, understand, and act on websites. These protocols form a new technical foundation that determines whether AI agents can seamlessly interact with your brand or struggle to extract meaning from your content.

The protocol stack operates across multiple layers: MCP connects agents to external tools and APIs, A2A enables agent-to-agent communication, NLWeb and WebMCP make websites directly queryable, while ACP and UCP power agent-driven commerce. Each protocol serves a distinct purpose, yet they work together to create a unified ecosystem.

MCP, launched by Anthropic in November 2024 and now adopted by OpenAI, Google, and Microsoft, has become the de facto standard with over 10,000 servers. It eliminates the need for custom integrations by providing a universal connector between AI agents and data sources. A2A, introduced by Google in April 2025 with 50+ technology partners, allows AI agents from different vendors to communicate and delegate tasks through standardized “Agent Cards.”
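The "Agent Card" idea can be pictured as a small machine-readable manifest that other agents fetch and parse. The fields below are a simplified, hypothetical illustration, not the actual A2A schema; consult the A2A specification for the real format:

```python
import json

# A hypothetical, simplified Agent Card. The real A2A schema defines
# more fields; this sketch only shows the idea of a capability
# manifest another agent can discover and read.
agent_card = {
    "name": "example-store-agent",
    "description": "Answers product and inventory questions.",
    "url": "https://example.com/agent",
    "capabilities": ["product_search", "order_status"],
}
manifest = json.dumps(agent_card, indent=2)
```

If an agent can't parse a brand's manifest, or the data in it contradicts other sources, delegation simply routes around that brand—which is the filtering risk the next paragraph describes.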

For brands, this means structured data and clean APIs are no longer just SEO best practices—they’re essential for agent compatibility. As multi-agent workflows become standard, your brand may be evaluated across multiple checkpoints before reaching human users. Inconsistent data across sources could filter your brand out entirely.

The bottom line: if you want AI agents to discover, recommend, and interact with your brand effectively, you need to understand and implement these protocols. They represent the new table stakes for visibility in an agent-driven future.

🔗 Backlinko


Your guide to SEO ranking in organic search

This guide to SEO ranking covers the essentials of improving your website’s organic search position. I’ll walk you through what affects your rankings and share practical strategies to boost your results. The same practices that improve traditional search engine rankings also enhance visibility in AI tools.

Your SEO ranking represents a webpage’s organic position in search engine results for specific queries. These rankings are influenced by factors related to your page’s relevance, quality, and usability. When your content ranks higher, you gain increased brand visibility and potential website traffic.

Search engines use complex algorithms to evaluate and rank results based on factors like content relevance, keyword usage, and user intent. The article breaks down three main categories of ranking factors: on-page SEO (content and structure), off-page SEO (external signals like backlinks), and technical SEO (site performance and architecture).

To improve your rankings, focus on incorporating relevant keywords naturally throughout your content, optimizing title tags and meta descriptions, using proper header tags, and adding descriptive alt text for images. The guide emphasizes creating high-quality content that demonstrates Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T), as Google prioritizes such content when determining rankings.
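Title and meta-description checks like these are easy to automate. A small sketch using widely cited length rules of thumb—roughly 60 characters for titles and 160 for meta descriptions; the thresholds are my assumption, not figures from the guide:

```python
def audit_metadata(title, meta_description):
    """Flag common on-page metadata problems.

    Length thresholds are approximate rules of thumb for what Google
    typically displays before truncating.
    """
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > 60:
        issues.append("title may be truncated in results")
    if not meta_description:
        issues.append("missing meta description")
    elif len(meta_description) > 160:
        issues.append("meta description may be truncated")
    return issues

audit_metadata("Short title", "")  # ['missing meta description']
```

Run across a sitemap, a check like this surfaces the quick on-page wins before the harder content work starts.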

Remember that ranking for the right keywords matters more than just achieving high positions. Target terms that align with your business goals and capture users who are most likely to convert. By following these guidelines, you’ll be well-positioned to improve your SEO rankings and drive meaningful results for your business.

🔗 Semrush Blog


Why ChatGPT Cites One Page Over Another (Study of 1.4M Prompts)

We studied 1.4 million ChatGPT prompts to discover what makes ChatGPT cite one page over another.

Our research found that 88% of cited URLs come from search results, while Reddit content appears in 67.8% of non-cited URLs despite being heavily retrieved. The title, snippet, and URL serve as gatekeepers before ChatGPT reads your actual content. Pages with higher semantic similarity to queries and human-readable URLs perform better. If you want ChatGPT to cite your content, focus on ranking in search results and optimizing your metadata.
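The study's semantic similarity would be measured with embeddings; as a crude lexical stand-in, a token-overlap score illustrates the idea of checking how closely a title matches a query before the page body is ever read:

```python
def token_overlap(query, title):
    """Jaccard overlap between query and title tokens.

    A rough lexical proxy for the semantic similarity the study
    describes—useful only as a quick sanity check, not a replacement
    for embedding-based measurement.
    """
    q, t = set(query.lower().split()), set(title.lower().split())
    return len(q & t) / len(q | t) if q | t else 0.0

token_overlap("best crm for ai search", "The Best CRM Tools for AI Search")
```

A title scoring near zero against its target queries is a signal the "gatekeeper" metadata needs rewriting before worrying about the content behind it.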

🔗 Ahrefs Blog


Your AI Visibility Strategy Doesn’t Work Outside English

I need to rethink our AI visibility strategy for global markets because the current approach only works in English.

Recent data shows that AI platforms vary dramatically by region – in China, Baidu’s ERNIE Bot has 200 million monthly users while ChatGPT isn’t even accessible. South Korea’s Naver dominates with 62.86% market share and deploys AI Briefing for 20% of searches. Europe has launched Mistral in France, Aleph Alpha in Germany, and OpenEuroLLM covering 24 languages. The Middle East is building sovereign AI with UAE’s Falcon Arabic and Saudi Arabia’s HUMAIN. India’s BharatGen and Southeast Asia’s SEA-LION support regional languages. Our English-only content architecture simply doesn’t exist in these ecosystems. I must adapt our AI visibility strategy to match the actual platforms and languages our target customers use in each market.

🔗 Search Engine Journal


Machine-First Architecture: AI Agents Are Here And Your Website Isn’t Ready

Machine-first architecture is here, and your website isn’t ready, says Slobodan Manic of the No Hacks Podcast. AI agents are already shipping in browsers used by billions of people, with every major tech company launching either AI-integrated browsers or extensions that act on behalf of users.

From my testing and research, I’ve found that websites are nowhere near ready for this shift because structurally almost every website is broken for AI agent interaction. Claude for Chrome can navigate websites, fill forms, and perform multi-step operations. Google’s Gemini in Chrome includes agentic browsing capabilities that can act on webpages automatically. OpenClaw connects large language models directly to browsers and system tools to execute tasks autonomously.

What changed in the last six to nine months is that AI has shifted from waiting for us to come to it, to coming to us and meeting us where we are. When ChatGPT launched in 2023, we asked AI questions. Now agents represent an even bigger shift where AI can complete tasks on our behalf and run complex systems.

Most websites aren’t built or ready for this agentic world. Some experts predict websites will become optional for end users, with pages built by machines for machines and interaction happening through closed system interfaces. Google recently received a patent allowing AI to rewrite landing pages if they’re not good enough, and Gemini browsing in Chrome creates an end-to-end AI system where humans wait for results.

The timeline for this becoming reality is within a year for basic functionality, with 2027 being a realistic target for widespread adoption. This represents a fundamental shift in how we need to think about website architecture and user experience.

🔗 Search Engine Journal


ChatGPT Often Retrieves But Rarely Cites Reddit Pages, Data Shows

ChatGPT often retrieves but rarely cites Reddit pages, new data from Ahrefs shows.

An analysis of 1.4 million ChatGPT prompts found that while Reddit content frequently appears in search results, it is cited in only 1.93% of responses. This suggests Reddit plays a key role in shaping answers but rarely gets direct credit. To improve your chances of being cited, focus on aligning your page titles and URLs with the specific sub-questions ChatGPT generates during its search process. Clear, descriptive URLs also boost citation rates significantly. These insights can help refine your SEO strategy for better visibility in AI-driven search results.

🔗 Search Engine Journal


The Modern SEO Center Of Excellence: Governance, Not Guidelines

The modern SEO Center of Excellence must govern, not just advise.

Traditional SEO CoEs fail because they lack authority over the systems that determine search performance. Modern search evaluates whether organizations present themselves as coherent systems, not isolated pages. AI-driven discovery requires consistent structure, entity definitions, and machine-readable data across all digital assets. Without centralized governance, templates evolve independently, content fragments, and structured data implementations vary, causing search engines to exclude unreliable sources. The future of SEO CoEs isn’t about sharing knowledge more efficiently—it’s about controlling standards before digital assets are created. Governance transforms SEO from optional recommendations into required infrastructure that ensures discoverability and prevents exclusion from AI-driven search results.

🔗 Search Engine Journal


Google Just Made It Easy For SEOs To Kick Out Spammy Sites

Google just made it easier for SEOs to remove spammy sites from search results.

I can now report spam directly, and Google may take manual action against these sites. This is a significant change from the previous policy, where reports were only used to improve spam detection systems. Now, when I submit a spam report, Google might issue a manual action, which could result in the site being deindexed. This gives me a new tool to combat spam in search results.

🔗 Search Engine Journal



I am Wonfull, an SEO & GEO expert driving next-gen organic growth. I recently scaled a Middle Eastern media project's organic traffic by 10x in 6 months. As an AI builder, I created seo-audit (delivers a 92-point SEO diagnostic report in 1 minute) and am developing GEOWriter to automate content pipelines via agentic workflows.
