
Nvidia stock continues slide: is the AI darling’s moat drying up as competition intensifies?


Nvidia stock slipped on Wednesday as investors reacted to fresh competitive pressure from Amazon’s new Trainium 3 artificial intelligence chip, the latest sign that major cloud providers are accelerating efforts to develop their own AI silicon.

At the time of publication, Nvidia stock was down 0.6%, trading at around $180.34.

Amazon unveiled Trainium 3 on Tuesday, pitching it as a cost-efficient alternative for training and operating AI models.

The company said the new chip can reduce AI training and inference costs by up to 50% compared with systems using equivalent GPUs — the category dominated by Nvidia.

Amazon also said it plans to use Nvidia’s NVLink Fusion technology in its future AI computing infrastructure, integrating it with the forthcoming Trainium4 chip.

“With Nvidia NVLink Fusion coming to AWS Trainium4, we’re unifying our scale-up architecture with AWS’s custom silicon to build a new generation of accelerated platforms,” Nvidia CEO Jensen Huang said.

“Together, NVIDIA and AWS are creating the compute fabric for the AI industrial revolution.”

Nvidia stresses long-term demand despite competitive moves

Nvidia is working to reassure investors that it can maintain dominant market share even as Amazon, Google and other hyperscalers expand use of in-house silicon.

The company’s neutral position in the market — as a supplier rather than a direct cloud-services competitor — remains a strategic advantage, as some technology giants may prefer not to depend heavily on rival hardware.

Nvidia CFO Colette Kress said Tuesday that AI models trained on its new Blackwell chips will begin emerging in about six months.

She noted the company has $500 billion in bookings for Blackwell and Rubin chips through 2026, excluding an upcoming deal with OpenAI that has yet to be finalised.

Separately, European AI start-up Mistral said it trained its next-generation models on Nvidia hardware.

The companies highlighted that Mistral’s Large 3 model achieved a tenfold performance improvement on Nvidia’s GB200 NVL72 server racks compared with the previous H200 generation.

Competitive landscape intensifies

While Oracle Cloud Infrastructure’s earlier adoption of more than 50,000 AMD chips signalled growing interest in non-Nvidia solutions, the most visible competitive pressure is now coming from Amazon Web Services.

With Trainium 3, AWS has taken a significant step toward deepening its in-house AI silicon strategy.

The chip is said to offer four times the performance of its predecessor and to cut energy consumption by 40%, underscoring AWS’s ambition to optimise its data centres around its own hardware.

Google, meanwhile, is stepping up outreach for its Tensor Processing Units (TPUs), pitching them to major customers such as Meta.

The push suggests Google is seeking to expand TPU adoption among hyperscalers that have traditionally relied on Nvidia GPUs.

The combined efforts of Amazon, Google and AMD signal a broadening competitive landscape in the AI hardware sector.

While Nvidia remains the clear leader, its largest customers are now among its most visible challengers — each moving to reduce reliance on external suppliers and expand control over their AI infrastructure.

