Google AI Lands on Oracle Cloud
This Week in Cloud — August 21, 2025
Welcome back to The Cloud Cover, the briefing built for architects, builders, and IT leaders. This week, Oracle shakes hands with Google, Microsoft faces hybrid security fallout, and Google bets big on securing the AI future. Let’s get into it.
⚡ The Great Decoupling
The biggest story this week wasn’t a new model or a flashy service—it was a partnership that could realign the cloud AI landscape. Oracle and Google Cloud announced that Google's advanced Gemini models will be offered directly on Oracle Cloud Infrastructure (OCI). More than just another multicloud deal, this is a pragmatic move that signals the potential decoupling of the AI model from the underlying cloud infrastructure.
For several years now, the hyperscalers have been building walled gardens, bundling their premier AI models with their own iron. The assumption was that to get the best AI, you had to use their stack. Oracle's strategy makes a bold counter-bet: that foundational models will eventually become commodities, and the winning platform will be the one that offers the most choice and best price-performance for running any model.
This partnership allows a customer to leverage Google's state-of-the-art AI while keeping their data and applications on OCI, which might be preferable for cost, performance, or existing database integrations. It puts direct pressure on the tightly integrated ecosystems of Microsoft/OpenAI and AWS/Bedrock, potentially shifting the competitive focus back to the fundamental, and less glamorous, aspects of at-scale computing: silicon, networking, and cost-efficiency.
The implications are significant. If this trend accelerates, your primary decision may no longer be "which cloud has the best AI models?" but "which cloud is the best place to run the AI model I choose?" This could mark a major inflection point in the market, forcing hyperscalers to compete on infrastructure quality and openness rather than the exclusivity of their models.
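What consumption might actually look like is still an open question. The sketch below is a speculative illustration using the OCI Python SDK's existing Generative AI inference client, assuming Gemini is surfaced through the same chat API OCI already offers for hosted models; the endpoint, compartment OCID, and model ID are placeholders, not details from the announcement.

```python
# Speculative sketch: calling a Gemini model through OCI's existing
# Generative AI inference API. The endpoint, compartment OCID, and model ID
# are placeholders; the announcement does not specify how Gemini is addressed.
import oci
from oci.generative_ai_inference import GenerativeAiInferenceClient
from oci.generative_ai_inference.models import (
    ChatDetails,
    GenericChatRequest,
    Message,
    OnDemandServingMode,
    TextContent,
)

config = oci.config.from_file()  # reads ~/.oci/config
client = GenerativeAiInferenceClient(
    config,
    service_endpoint="https://inference.generativeai.<region>.oci.oraclecloud.com",  # placeholder
)

response = client.chat(
    ChatDetails(
        compartment_id="ocid1.compartment.oc1..example",                  # placeholder
        serving_mode=OnDemandServingMode(model_id="<gemini-model-id>"),   # placeholder
        chat_request=GenericChatRequest(
            api_format="GENERIC",
            messages=[Message(role="USER",
                              content=[TextContent(text="Summarize our OCI spend drivers.")])],
            max_tokens=256,
        ),
    )
)
print(response.data)
```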
🔍 The Rundown
Next-Gen Intel-Powered Instances: AWS announced the general availability of its new Amazon EC2 R8i instances, powered by Intel's latest Xeon 6 processors. The result of a multi-year collaboration, these instances provide up to 20% better performance and include integrated accelerators for AI inference and encryption, offering a more cost-effective option for workloads that don't require a full-blown GPU.
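For teams that want to kick the tires, launching one of the new instances is no different from any other family. A quick boto3 sketch, where the region, AMI ID, and the r8i.large size are placeholders:

```python
# Illustrative sketch: launching one of the new R8i instances with boto3.
# The region, AMI ID, and instance size are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI for your region
    InstanceType="r8i.large",         # assumed size; confirm availability in your region
    MinCount=1,
    MaxCount=1,
)
print(resp["Instances"][0]["InstanceId"])
```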
Developer Experience Improvements: A series of smaller, targeted updates were released, aimed at improving the developer experience. These include expanded support for the Cilium CNI in Amazon EKS for hybrid environments, more flexibility in Amazon DynamoDB's throughput modes, and upgrades to the Amazon Q AI assistant.
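The announcement doesn't spell out exactly what changed in DynamoDB's throughput handling, but for context, switching a table between billing modes is already a one-call operation in boto3 (the table name here is a placeholder):

```python
# Sketch: toggling a DynamoDB table's billing mode with boto3.
# "orders" is a placeholder table name.
import boto3

ddb = boto3.client("dynamodb")

# Switch the table to on-demand (pay-per-request) throughput.
# Note: DynamoDB allows one billing-mode change per table per 24 hours.
ddb.update_table(TableName="orders", BillingMode="PAY_PER_REQUEST")

# The reverse direction, shown for reference rather than run immediately:
# ddb.update_table(
#     TableName="orders",
#     BillingMode="PROVISIONED",
#     ProvisionedThroughput={"ReadCapacityUnits": 100, "WriteCapacityUnits": 50},
# )
```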
Gartner Magic Quadrant Leadership: Reinforcing its market position, AWS was named a Leader in the 2025 Gartner Magic Quadrant for Strategic Cloud Platform Services for the fifteenth consecutive year. The report again placed AWS highest on the "Ability to Execute" axis, a powerful validation for risk-averse enterprise buyers.
Critical Exchange Server Vulnerability: The August "Patch Tuesday" release disclosed a critical vulnerability (CVE-2025-53786) in on-premises Exchange Server. The flaw allows an attacker to pivot from a local server directly into an organization's cloud identity fabric, turning an on-prem breach into a full-scale cloud compromise. Remediation is a complex, multi-step process, which increases the risk of incomplete patching across an estimated 29,000 internet-facing vulnerable servers.
GPT-5 General Availability: Microsoft made OpenAI's new flagship model, GPT-5, generally available within the Azure AI Foundry. This release reinforces the deep strategic partnership between the two companies and solidifies Azure's position as the premier cloud for accessing OpenAI's most advanced models.
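On the consumption side, calling the model should follow the familiar Azure OpenAI pattern. A minimal sketch with the openai Python package, where the endpoint, API version, and deployment name are placeholders for your own resource:

```python
# Sketch: calling a GPT-5 deployment in Azure AI Foundry with the openai package.
# Endpoint, API version, and deployment name are placeholders for your resource.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://your-resource.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-10-21",  # use the API version your resource supports
)

resp = client.chat.completions.create(
    model="gpt-5",  # the *deployment* name you chose, not necessarily the model name
    messages=[{"role": "user", "content": "List three risks of hybrid Exchange topologies."}],
)
print(resp.choices[0].message.content)
```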
Data Platform Updates: Despite security challenges, Azure rolled out several updates to its data platform. Key releases include the GA of Azure Managed Instance for Apache Cassandra v5.0 and the GA of the Azure Databricks connector for Microsoft Power Platform.
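From the application side, the managed Cassandra v5.0 service should behave like any other Cassandra cluster. A sketch with the standard Python driver, using placeholder contact points and credentials; TLS settings, which the managed service typically requires, are omitted for brevity:

```python
# Sketch: connecting to a managed Cassandra v5.0 cluster with cassandra-driver.
# Contact point and credentials are placeholders; TLS configuration is omitted.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

auth = PlainTextAuthProvider(username="cassandra", password="your-password")
cluster = Cluster(["10.0.0.4"], port=9042, auth_provider=auth)

session = cluster.connect()
row = session.execute("SELECT release_version FROM system.local").one()
print(row.release_version)  # expect a 5.0.x version string

cluster.shutdown()
```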
Nuclear Power Partnership: Google announced a landmark agreement to procure up to 50 MW of 24/7 carbon-free nuclear power from Kairos Power for its Tennessee and Alabama data centers. This is a decisive strategic move to secure the massive, stable energy supply required for the next generation of AI, insulating Google from energy market volatility and building a deep, structural advantage over competitors.
AI Agent Security Framework: At its Security Summit, Google unveiled a new framework to address threats posed by autonomous AI agents. New capabilities in Security Command Center can automatically discover and inventory AI agents, while Model Armor is being extended to protect against prompt injection and data disclosure in agent-based systems.
GKE Performance Enhancements: Google Kubernetes Engine (GKE) made its Topology Manager generally available, which optimizes performance for latency-sensitive AI and HPC workloads. GKE also now offers a preview of multi-subnet support for clusters, removing a key architectural limitation for large-scale environments.
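Topology alignment matters most for Pods in the Guaranteed QoS class with integer CPU requests, so workloads need to be shaped accordingly. Here is a sketch with the official Kubernetes Python client; the image, names, and namespace are placeholders, and the topology policy itself is configured on the GKE node pool rather than in the Pod spec.

```python
# Sketch: a Guaranteed-QoS Pod (integer CPU, requests == limits), the workload
# shape Topology Manager can NUMA-align. Names, image, and namespace are
# placeholders; the topology policy itself is configured on the node pool.
from kubernetes import client, config

config.load_kube_config()

resources = client.V1ResourceRequirements(
    requests={"cpu": "8", "memory": "32Gi"},
    limits={"cpu": "8", "memory": "32Gi"},
)
pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="hpc-worker"),
    spec=client.V1PodSpec(
        containers=[
            client.V1Container(name="worker", image="example.io/hpc:latest",
                               resources=resources)
        ],
        restart_policy="Never",
    ),
)
client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```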
CloudWorld Rebrands: Underscoring a laser focus on artificial intelligence, Oracle announced it is rebranding its flagship conference from Oracle CloudWorld to Oracle AI World. This is an aggressive attempt to reshape market perception and declare that AI is not just a feature of its cloud, but its entire cloud strategy.
AI Growth and Risk: This all-in AI strategy comes with significant risks, according to a third-party financial analysis. While IaaS revenue grew an impressive 45% year-over-year, fueled by demand for AI infrastructure, the massive capital expenditures required have pushed Oracle's free cash flow into the negative, creating anxiety among investors.
📈 Trending Now: Securing the Agent Legions
This week revealed a divergence in security strategy among the hyperscalers, exposing a new competitive front: securing the future of AI itself. As organizations move from simply using AI models as tools to deploying fleets of autonomous agents, an entirely new class of security risks emerges. The central question for leaders is no longer just about protecting infrastructure, but about governing agents.
On one side, Microsoft is consumed with the critical but reactive task of securing the complex and brittle connections of the hybrid present. The severe Exchange vulnerability is a perfect illustration: a flaw in a legacy on-premises system that punches a hole directly into the cloud identity fabric. Microsoft's focus is necessarily on patching the risks of architectures built over the last decade.
In contrast, Google is proactively building the security framework for the agentic future. Its announcements at the Security Summit—tools to discover AI agents, extend protections like Model Armor to agent prompts, and even build an "Agentic SOC"—are not about patching old systems. They are about creating new governance and protection models for a world where autonomous code executes complex tasks. This positions Google as a thought leader on next-generation threats and forces a critical question upon every CISO and architect: Is your security strategy focused on protecting the past or preparing for the future?
📅 Event Radar
- 4: Registration still open
- 15-18: Attend the largest Fabric conference in Europe
- 8-10: Speakers list now available
👋 Until Next Week
This week was a clear signal that the cloud market is entering a new phase of deep, structural competition. The battles are being fought over foundational hardware, strategic energy sourcing, and the security paradigms for an AI-driven future. As the AI model layer begins to unbundle from the infrastructure layer, the choices for builders become more complex, but also more powerful. Keep an eye on how—and if—the other major clouds respond to the new era of AI model neutrality.
Do you enjoy these emails? Your friends and colleagues might, too! Help us grow the cloud community by sharing the newsletter with others.