
AWS CEO Matt Garman said Amazon’s recent $50 billion investment in OpenAI, alongside its longstanding partnership with Anthropic that included an $8 billion investment, reflects the kind of conflict of interest the cloud giant is accustomed to managing.
Garman has been with Amazon since interning there during business school in 2005, before AWS launched in 2006, he told the audience at the HumanX conference taking place this week in San Francisco.
When asked about the inherent conflict of working closely with two AI model companies that are intense (and, arguably, sometimes petty) competitors, he said it’s not an issue. Because AWS frequently competes with its own partners, it has ample firsthand experience with such rivalry, he explained.
During AWS’s initial years, it recognized it couldn’t create every cloud solution independently, so the division allied with others.
“We also understood that we might compete with our partners, as technology is intertwined,” Garman reflected. “Thus, for a considerable time, we’ve developed the capability to market alongside our partners,” he continued. “However, we might even have first-party products that rival them, and that’s acceptable, and we’ve assured them that we won’t give ourselves an undue competitive edge.”
Nowadays, the public is familiar with Amazon competing against those who offer services on its cloud. Even one of AWS’s major competitors, Oracle, provides its database and additional services on AWS. However, this was a groundbreaking concept back in 2006, when tech partners took care not to compete with those that contributed to their success.
Nonetheless, Amazon is hardly the first to abandon investor loyalty and conflict-of-interest norms in the cutthroat, profit-driven world of AI. When Anthropic announced its latest $30 billion funding round in February, the round included at least twelve investors who were also backing OpenAI, among them OpenAI’s primary cloud partner, Microsoft.
For AWS, investing heavily in OpenAI to bring its models to AWS customers (and to gain a technology development ally) was nearly a matter of survival. Both companies’ models were already available on Microsoft’s cloud, AWS’s most significant competitor.
The cloud titans are also striving to maintain visibility by providing AI model-routing services. These services enable their clients to automatically deploy different models for various functions, aiming to enhance performance and cut costs. As Garman illustrated, one model may be optimal for planning, another for reasoning, and a more economical model for simpler tasks, like code completion. “I believe that is where the future will lead,” Garman remarked.
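The routing idea Garman describes can be sketched in a few lines: pick the cheapest model whose strengths cover the task at hand, and fall back to the cheapest model overall for simple or unrecognized jobs. All model names and prices below are hypothetical placeholders, not real AWS offerings or Bedrock identifiers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Model:
    name: str
    cost_per_1k_tokens: float  # hypothetical pricing
    strengths: frozenset       # task types this model handles well

# Hypothetical model catalog for illustration only.
CATALOG = [
    Model("planner-xl", 0.0150, frozenset({"planning"})),
    Model("reasoner-lg", 0.0100, frozenset({"reasoning", "planning"})),
    Model("coder-sm", 0.0005, frozenset({"code-completion", "summarization"})),
]

def route(task_type: str) -> Model:
    """Return the cheapest model whose strengths cover the task;
    fall back to the cheapest model overall for unknown tasks."""
    candidates = [m for m in CATALOG if task_type in m.strengths]
    pool = candidates or CATALOG
    return min(pool, key=lambda m: m.cost_per_1k_tokens)

print(route("planning").name)         # cheapest model that can plan
print(route("code-completion").name)  # cheap model for simple tasks
```

Real routing services score requests against live quality and latency metrics rather than a static table, but the cost-versus-capability trade-off is the same one Garman outlines.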
That is also how Amazon, and Microsoft for that matter, will get their own proprietary models into customers’ hands — the recurring competition-with-your-partners scenario once again.
All’s fair in love and AI these days.

