Official Publication: B2AI Protocol Achieves 91% Compute and 79% Egress Reduction

By Pablo Gil | May 12, 2026 | Milestone & Open Source Release

The hypothesis has been empirically validated. After weeks of rigorous stress testing across our decentralized edge network, the preliminary results of the B2AI Protocol are now officially published on Zenodo (CERN's open-access repository).

Zenodo Publication (DOI)

The complete whitepaper, detailing the topological deployment, the "Dark Bot" evasion tactics, and the mitigation of Scope 3 emissions, is available in Open Access.

Read the paper: https://doi.org/10.5281/zenodo.20128943

Phase 3 Results: 91% Compute Optimization

During our Phase 3 Critical Stress Test, unoptimized control nodes buckled under simulated LLM crawler loads, reaching 37.52 ms of CPU processing time. In stark contrast, the nodes protected by the B2AI Protocol, which dispatch lightweight JSON-LD to synthetic agents, operated flawlessly at just 3.26 ms.

This empirical data confirms a 91% reduction in CPU consumption, drastically mitigating the energy waste associated with generative AI scraping.

Network Egress (Bandwidth): Reduced from 2.0 GB to 431 MB (79% Saving).

By "cleaning" the traffic and avoiding the transfer of heavy visual assets to datacenter-based crawlers, we prevent the waste of over 1.5 GB of bandwidth in a single high-stress event.

Releasing B2AI V3.5 (Production Balance)

We are officially open-sourcing the Version 3.5 Router. This iteration strikes a critical production balance: it aggressively hunts masked datacenter servers using our optimized ASN net, but maintains safe tolerances for corporate network providers such as Zscaler and Akamai, ensuring zero impact on legitimate human traffic.

// =====================================================================
//  5. THE B2AI V3.5 DETECTOR (PRODUCTION BALANCE) 
// =====================================================================
// Runs inside the Worker's fetch handler; request.cf carries Cloudflare's edge
// metadata (e.g., the ASN organization) and is undefined outside Cloudflare.
const userAgent = request.headers.get('User-Agent') || '';
const asnOrg = (request.cf && request.cf.asOrganization) ? request.cf.asOrganization : '';

// 1. Basic ID + scraping tools (strictness maintained)
const botRegex = /bot|crawler|spider|claude|gpt|applebot|bing|yandex|baidu|python|curl|wget|http-client|headless|puppeteer|playwright|selenium|lighthouse|inspect|scanner|audit/i;

// 2. Optimized ASN Net (Hunts servers, respects corporations like Zscaler/Akamai)
const datacenterRegex = /amazon|aws|google|microsoft|azure|digitalocean|hetzner|ovh|linode|contabo|alibaba|tencent|hosting|cloud|datacenter|vultr|oracle|m247|leaseweb|choopa|pwn|stark/i;

// 3. FINAL VERDICT: flag as bot if the UA matches, the UA is empty,
//    or the ASN organization is a datacenter
const isBot = botRegex.test(userAgent) ||
              userAgent.trim() === '' ||
              datacenterRegex.test(asnOrg);

if (isBot) {
  // Abort HTML and deliver the useful semantic payload
  return new Response(JSON.stringify(jsonLdData, null, 2), {
    headers: { 
      "Content-Type": "application/json;charset=UTF-8",
      "Cache-Control": "public, max-age=86400"
    }
  });
}
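
The snippet above is the decision core, not a complete Worker. A minimal sketch of how it slots into a Cloudflare Worker using module syntax; the jsonLdData object and the human-facing HTML below are placeholders, not our production payloads:

export default {
  async fetch(request) {
    // Placeholder payload; in production this is the site's full schema.org graph
    const jsonLdData = { "@context": "https://schema.org", "@type": "WebSite", "name": "Example" };

    // >>> paste the V3.5 detector above here; it returns early with JSON-LD for bots <<<

    // Human visitors fall through to the normal visual payload
    return new Response("<!doctype html><title>Human view</title>", {
      headers: { "Content-Type": "text/html;charset=UTF-8" }
    });
  }
};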

Building Green AI Infrastructure: The Lab Notes

By Pablo Gil | May 10, 2026 | Live Experiment Log & Open Source

The AI revolution has a massive, undocumented energy leak: the friction of legacy web crawling. Forcing LLMs to download visual code (HTML/CSS) just to extract semantic data undermines corporate ESG goals. We are testing a solution in real time.

The Premise: Business-to-AI (B2AI) Routing

Within the Project EOS framework, we are conducting a live A/B test across multiple edge nodes. The hypothesis is simple: if we can detect an AI crawler at the network edge and serve it a pure JSON-LD vector instead of the visual website, we can drastically cut the digital carbon footprint (Scope 3 emissions) associated with RAG and LLM training.
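
For concreteness, this is the shape of payload a detected crawler receives instead of the rendered page. The fields here are illustrative placeholders, not our production schema:

// Illustrative schema.org object served in place of HTML/CSS
const jsonLdData = {
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Building Green AI Infrastructure",
  "author": { "@type": "Person", "name": "Pablo Gil" },
  "description": "Edge-routed semantic payload for AI crawlers."
};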

Phase 1 Results: The 34% Baseline

Status: Completed

Our initial deployment utilized standard 'User-Agent' detection (identifying known bots like ClaudeBot or Googlebot). The data confirmed our theory, yielding a 34% reduction in bandwidth consumption per request compared to the unoptimized legacy node.
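
For reference, the Phase 1 filter amounted to little more than a User-Agent match. A minimal sketch (the bot list is illustrative; the production regex was longer), assuming the same Worker context as the detector blocks in this post:

// Phase 1 (Level 1): trust whatever the client claims to be
const ua = request.headers.get('User-Agent') || '';
const knownBots = /claudebot|googlebot|gptbot|bingbot/i;
if (knownBots.test(ua)) {
  return new Response(JSON.stringify(jsonLdData), {
    headers: { "Content-Type": "application/json;charset=UTF-8" }
  });
}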

The Challenge: Dark Bots and Spoofers

A 34% reduction is a significant optimization, but it falls short of our architectural goal. Telemetry analysis revealed why: the internet is full of "Dark Bots". These scrapers actively spoof their identities, claiming to be human browsers (e.g., Chrome on Windows) to bypass basic filters, forcing our edge servers to deliver the heavy, energy-intensive visual payload.
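
The failure mode is easy to reproduce: a scraper shipping a stock Chrome User-Agent sails straight past the Phase 1 regex sketched above. For example:

const spoofedUA = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 ' +
                  '(KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36';
console.log(knownBots.test(spoofedUA)); // false -> the spoofer receives the heavy visual payload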

Phase 2: Open Sourcing the Level 3 Router

To achieve true ESG compliance, we cannot rely on what a bot claims to be; we must look at where it lives. We have upgraded the B2AI Router to Level 3 Detection, analyzing Autonomous System Number (ASN) data via Cloudflare's edge telemetry.

Because climate responsibility cannot wait for corporate paywalls, I am open-sourcing the core logic of our Level 3 Filter. Any developer can implement this in minutes to stop datacenter spoofers from wasting their server's energy:

// =====================================================================
// B2AI DETECTOR (LEVEL 3: DATACENTER / ASN VALIDATION) 
// =====================================================================
// Same execution context as the V3.5 block above: a Cloudflare Worker fetch handler
const userAgent = request.headers.get('User-Agent') || '';
const asnOrg = (request.cf && request.cf.asOrganization) ? request.cf.asOrganization : '';

// 1. Basic ID: Catch official bots and lazy scrapers
const botRegex = /bot|crawler|spider|claude|perplexity|python|curl|wget|http-client|headless|puppeteer|playwright|postman|slurp|yandex/i;

// 2. The Datacenter Trap: Humans use residential ISPs. If it comes from a cloud provider, it's a machine.
const datacenterRegex = /amazon|aws|google cloud|google inc|microsoft|azure|digitalocean|hetzner|ovh|linode|contabo|alibaba|tencent|hosting|cloud|datacenter/i;

// 3. Final Verdict: It's a bot if the ID says so, if it's empty, OR if the network cable comes from a server farm.
const isBot = botRegex.test(userAgent) || userAgent.trim() === '' || datacenterRegex.test(asnOrg);

if (isBot) {
  // Abort visual HTML. Serve pure JSON-LD vector (4KB).
  return new Response(JSON.stringify(jsonLdData, null, 2), {
    headers: { "Content-Type": "application/json;charset=UTF-8" }
  });
}
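
If you deploy it, the routing is easy to verify by presenting a bot identity. A quick check from Node 18+ (browser fetch will not let you override the User-Agent header); example.com stands in for your own zone:

// Expect JSON-LD back when the request looks like a bot
const res = await fetch('https://example.com/', {
  headers: { 'User-Agent': 'ClaudeBot/1.0' }
});
console.log(res.headers.get('Content-Type')); // "application/json;charset=UTF-8" when routed as a bot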

The new protocol is live. We are letting the global network stress-test this logic. The telemetry from the coming days will determine if we can push that 34% efficiency rate closer to total elimination of crawling waste.

About the author: Pablo Gil is an independent researcher specializing in algorithmic efficiency and AI infrastructure. Lead R&D in B2AI Infrastructure | LLM Telemetry & Corporate ESG Compliance | Open Source Advocate | NGI Zero Applicant.