Bountiful_Harvest
4 days ago
Cool Beans! Big Tech group’s Annapurna Labs is spending big to build custom chips that lessen its reliance on market leader Nvidia
https://arstechnica.com/ai/2024/11/amazon-ready-to-use-its-own-ai-chips-reduce-its-dependence-on-nvidia/
Amazon is poised to roll out its newest artificial intelligence chips as the Big Tech group seeks returns on its multibillion-dollar semiconductor investments and to reduce its reliance on market leader Nvidia.
Executives at Amazon’s cloud computing division are spending big on custom chips in the hopes of boosting the efficiency inside its dozens of data centers, ultimately bringing down its own costs as well as those of Amazon Web Services’ customers.
The effort is spearheaded by Annapurna Labs, an Austin-based chip start-up that Amazon acquired in early 2015 for $350 million. Annapurna’s latest work is expected to be showcased next month when Amazon announces widespread availability of ‘Trainium 2,’ part of a line of AI chips aimed at training the largest models.
Trainium 2 is already being tested by Anthropic—the OpenAI competitor that has secured $4 billion in backing from Amazon—as well as Databricks, Deutsche Telekom, and Japan’s Ricoh and Stockmark.
AWS and Annapurna’s target is to take on Nvidia, one of the world’s most valuable companies thanks to its dominance of the AI processor market.
“We want to be absolutely the best place to run Nvidia,” said Dave Brown, vice-president of compute and networking services at AWS. “But at the same time we think it’s healthy to have an alternative.” Amazon said ‘Inferentia,’ another of its lines of specialist AI chips, is already 40 percent cheaper to run for generating responses from AI models.
“The price [of cloud computing] tends to be much larger when it comes to machine learning and AI,” said Brown. “When you save 40 percent of $1,000, it’s not really going to affect your choice. But when you are saving 40 percent on tens of millions of dollars, it does.”
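A rough, illustrative sketch of the scale effect Brown is describing; the spend figures below are placeholders, not actual AWS pricing:

# Illustrative only: a fixed 40 percent discount barely matters at hobbyist
# spend levels but becomes decisive at cloud-AI spend levels.
def savings(annual_spend: float, discount: float = 0.40) -> float:
    """Dollar amount saved at a given discount rate."""
    return annual_spend * discount

for spend in (1_000, 10_000_000, 50_000_000):
    print(f"Annual spend ${spend:>12,}: saves ${savings(spend):>12,.0f}")
# $1,000 -> $400 saved; $10,000,000 -> $4,000,000; $50,000,000 -> $20,000,000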
Amazon now expects around $75 billion in capital spending in 2024, with the majority on technology infrastructure. On the company’s latest earnings call, chief executive Andy Jassy said he expects the company will spend even more in 2025.
This represents a surge on 2023, when it spent $48.4 billion for the whole year. The biggest cloud providers, including Microsoft and Google, are all engaged in an AI spending spree that shows little sign of abating.
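For reference, the capital-spending figures quoted above imply roughly 55 percent year-over-year growth; a quick back-of-the-envelope check:

# Implied growth from $48.4bn (full-year 2023) to roughly $75bn (expected 2024).
capex_2023 = 48.4  # $bn, full-year 2023
capex_2024 = 75.0  # $bn, expected 2024
growth = (capex_2024 - capex_2023) / capex_2023
print(f"Implied capex growth: {growth:.0%}")  # ~55%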
Amazon, Microsoft, and Meta are all big customers of Nvidia, but are also designing their own data center chips to lay the foundations for what they hope will be a wave of AI growth.
“Every one of the big cloud providers is feverishly moving towards a more verticalized and, if possible, homogenized and integrated [chip technology] stack,” said Daniel Newman at The Futurum Group.
“Everybody from OpenAI to Apple is looking to build their own chips,” noted Newman, as they seek “lower production cost, higher margins, greater availability, and more control.”
“It’s not [just] about the chip, it’s about the full system,” said Rami Sinno, Annapurna’s director of engineering and a veteran of SoftBank’s Arm and Intel.
For Amazon’s AI infrastructure, that means building everything from the ground up, from the silicon wafer to the server racks they fit into, all of it underpinned by Amazon’s proprietary software and architecture. “It’s really hard to do what we do at scale. Not too many companies can,” said Sinno.
After starting out building a security chip for AWS called Nitro, Annapurna has since developed several generations of Graviton, its Arm-based central processing units that provide a low-power alternative to the traditional server workhorses provided by Intel or AMD.
“The big advantage to AWS is their chips can use less power, and their data centers can perhaps be a little more efficient,” driving down costs, said G Dan Hutcheson, analyst at TechInsights. If Nvidia’s graphics processing units are powerful general purpose tools—in automotive terms, like a station wagon or estate car—Amazon can optimize its chips for specific tasks and services, like a compact or hatchback, he said.
So far, however, AWS and Annapurna have barely dented Nvidia’s dominance in AI infrastructure.
Nvidia logged $26.3 billion in revenue for AI data center chip sales in its second fiscal quarter of 2024. That figure is the same as Amazon announced for its entire AWS division in its own second fiscal quarter—only a relatively small fraction of which can be attributed to customers running AI workloads on Annapurna’s infrastructure, according to Hutcheson.
As for the raw performance of AWS chips compared with Nvidia’s, Amazon avoids making direct comparisons and does not submit its chips for independent performance benchmarks.
“Benchmarks are good for that initial: ‘hey, should I even consider this chip,’” said Patrick Moorhead, a chip consultant at Moor Insights & Strategy, but the real test is when they are put “in multiple racks put together as a fleet.”
Moorhead said he is confident Amazon’s claims of a 4-times performance increase between Trainium 1 and Trainium 2 are accurate, having scrutinized the company for years. But the performance figures may matter less than simply offering customers more choice.
“People appreciate all of the innovation that Nvidia brought, but nobody is comfortable with Nvidia having 90 percent market share,” he added. “This can’t last for long.”
© 2024 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
DiscoverGold
2 weeks ago
Amazon (AMZN) Stock Eyes Record Highs After Earnings
By: Schaeffer's Investment Research | November 1, 2024
• Amazon.com reported a third-quarter revenue and earnings beat
• AMZN price-target hikes and call options are flying off the shelves today
Shares of Big Tech staple Amazon.com Inc (NASDAQ:AMZN) are surging today, last seen 7.2% higher to trade at $199.90. The e-commerce company's strong third-quarter results included earnings of $1.43 per share on revenue of $158.88 billion, both of which beat Wall Street's expectations thanks to growth in its cloud computing and advertising units. In addition, Amazon Web Services' revenue grew 19% over the last 12 months, slightly below estimates but still faster growth than a year ago.
No fewer than 18 analysts have hiked their price targets in response. Truist Securities appears to be the most bullish, moving from $265 up to $270 -- a more than 35% premium to the equity's current perch.
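As a quick check of that premium figure, using the $199.90 price quoted above:

# Upside implied by Truist's $270 target versus the last trade near $199.90.
target = 270.00
last_price = 199.90
premium = target / last_price - 1
print(f"Implied upside: {premium:.1%}")  # ~35.1%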
AMZN is a chip shot from its July 8 all-time high of $201.20, though the round-number $200 level could turn out to be a hesitation point. The shares are now back above the $2 trillion market cap level for the first time since July, and the stock is up 31.1% year-to-date and 43.1% year-over-year.
AMZN options are flying off the shelves. In the first half hour of trading, 245,000 calls and 128,000 puts have crossed the tape, volume that's eight times the intraday average amount. The weekly 11/1 200-strike call is the most popular by a wide margin, with the 205-strike call from the same series following distantly.
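For readers who prefer the skew in ratio form, a quick calculation from the approximate volumes quoted above (first half hour of trading):

# Call-heavy early options flow in AMZN on 2024-11-01, per the figures above.
calls = 245_000
puts = 128_000
print(f"Call share of volume: {calls / (calls + puts):.0%}")  # ~66%
print(f"Put/call ratio:       {puts / calls:.2f}")            # ~0.52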
Read Full Story »»»
DiscoverGold
DiscoverGold
3 weeks ago
Amazon Earnings Driver Is AWS
By: 24/7 Wall St. | October 28, 2024
• The key to Amazon.com Inc. (NASDAQ: AMZN) earnings will be AWS.
• It has adopted AI software and has to show that this will add to Amazon’s earnings.
Amazon.com Inc. (NASDAQ: AMZN) has been the largest e-commerce company in America for decades, and it still is. It undercut physical retailers on price and delivery so thoroughly that it was blamed for losses across the brick-and-mortar industry. No one would have guessed that Amazon Web Services, now known as AWS, would transform the massive online retailer and that e-commerce would take a back seat to cloud computing. But that is what has happened.
AWS is the largest cloud computing company in America based on revenue. The sector has become a horse race recently, driven by which players can adopt artificial intelligence to draw customers and market share. Amazon investors have been concerned that Microsoft Corp.’s (NASDAQ: MSFT) relationship with AI leader OpenAI would give it an early lead. However, it is far too early to tell. AWS CEO Matt Garman recently told The Wall Street Journal that his company had been more “deliberate” as it rolled out AI features and was not trailing Microsoft’s services.
Amazon’s upcoming earnings will show whether its approach is correct. When Amazon announced earnings for the second quarter, CEO Andy Jassy offered several explanations of why AWS was the best AI-based solution for most cloud customers.
Upcoming earnings will be early proof of whether Jassy is right. In the second quarter, AWS had revenue of $26.3 billion, 18% of Amazon’s total. However, its operating income was $9.3 billion, compared with Amazon’s $14.5 billion total, which made it 64% of the bottom line. AWS revenue growth was 19% from the same period in 2023. Presumably, if AI features have drawn customers recently, the growth rate will rise substantially.
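Those shares can be reproduced from the figures above; note that the total revenue number below is an assumption (the article gives only the 18% share), while the operating income figures are as quoted:

# AWS as a slice of Amazon's second quarter, per the figures cited above.
aws_revenue = 26.3      # $bn, Q2 AWS revenue
total_revenue = 148.0   # $bn, approximate Q2 total revenue (assumption)
aws_op_income = 9.3     # $bn, Q2 AWS operating income
total_op_income = 14.5  # $bn, Q2 total operating income
print(f"AWS share of revenue:          {aws_revenue / total_revenue:.0%}")      # ~18%
print(f"AWS share of operating income: {aws_op_income / total_op_income:.0%}")  # ~64%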
When Amazon posts third-quarter earnings, its original e-commerce business will not matter much if AWS does not post home run numbers.
Read Full Story »»»
DiscoverGold
DiscoverGold
1 month ago
2 Big Tech Stocks Dinged by Downgrades
By: Schaeffer's Investment Research | October 7, 2024
• Both AAPL and AMZN were downgraded this morning
• Both AMZN and AAPL have seen high options volume over the last two weeks
Tech giants Apple Inc (NASDAQ:AAPL) and Amazon.com Inc (NASDAQ:AMZN) are both moving lower today, following this morning's bear notes.
Jefferies downgraded AAPL to "hold" from "buy," noting that near-term expectations for the company's iPhone 16 and iPhone 17, both of which feature artificial intelligence (AI) software, were overblown after weaker-than-anticipated initial demand. At last glance, Apple stock was down 0.9% to trade at $224.76.
Wells Fargo slashed its rating on AMZN to "equal weight" from "overweight," citing slowing growth and heightened competition from Walmart (WMT), warning cloud strength was "not enough." At last look, Amazon.com stock was down 2.8% to trade at $181.29.
Both stocks are no strangers to Schaeffer's Senior Quantitative Analyst Rocky White's list of stocks that have attracted the highest options volume in the last 10 days, and this most recent period is no different. Over the past two weeks, both equities have seen a wealth of bullish-leaning trading.
Read Full Story »»»
DiscoverGold
tw0122
2 months ago
"It Changes The Math": Biden's Tariff Crackdown Throws Amazon And Walmart's Sneaky China Plans Into Chaos
Under mounting pressure from Chinese retail giants like Shein and Temu, American behemoths Amazon and Walmart have been cooking up a scheme to dodge tariffs and slash costs - but a new move by the Biden administration might just rain on their parade.
For months, these U.S. retailers have been quietly plotting to overhaul their business models, aiming to ship more goods directly from Chinese factories straight to your doorstep. By doing so, they'd cut out pricey U.S. warehouses and stores, all while skirting hefty tariffs using a little-known loophole in a century-old trade law.
Amazon has been preparing a new discount service that would ship products directly to consumers, allowing those goods to bypass tariffs. (Image credit: Octavio Jones for The New York Times)
This loophole, known as "de minimis," lets importers bypass U.S. taxes and tariffs on shipments valued under $800. The result? Chinese platforms like Shein and Temu have been flooding the market with dirt-cheap products, leaving American companies scrambling to keep up.
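A minimal sketch of the threshold logic as the article describes it; real customs treatment involves many more conditions, so this is illustrative only:

# De minimis as described above: shipments declared under $800 enter the
# U.S. free of duties and tariffs; larger shipments pay the tariff rate.
DE_MINIMIS_THRESHOLD = 800.00  # USD per shipment

def duty_owed(declared_value: float, tariff_rate: float) -> float:
    """Tariff owed on a shipment; zero if under the de minimis threshold."""
    if declared_value < DE_MINIMIS_THRESHOLD:
        return 0.0
    return declared_value * tariff_rate

print(duty_owed(60.00, 0.25))      # a $60 parcel: 0.0
print(duty_owed(12_000.00, 0.25))  # a $12,000 bulk shipment: 3000.0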
But on Friday, the Biden administration threw a wrench in the works. In a surprise announcement, officials declared plans to slam the door on many Chinese imports exploiting the de minimis rule—especially clothing items. The crackdown aims to curb the tsunami of duty-free packages pouring into the country, predominantly from China.
While the changes won't happen overnight - the proposal will undergo industry scrutiny before finalization - the message is clear: The free ride is coming to an end, the NY Times reports.
Amazon had been gearing up to launch a discount service capitalizing on direct-to-consumer shipments from China, insiders revealed. Walmart, even if reluctant to shake up its model, felt the heat to consider similar tactics to stay competitive.
"It's get on board or get left behind," said Steve Story, executive vice president for customs and international trade at Apex Logistics International. "If you don't get online and embrace this, you're going to be overshadowed by Shein, Temu, and Alibaba."
Story admitted he's assisted Chinese sellers in dodging tariffs by shipping through Amazon's fulfillment centers, thanks to a 2020 customs ruling allowing Chinese firms to act as "non-resident importers." Essentially, they can send products tariff-free to themselves via Amazon warehouses scattered across the U.S.
Traditionally, retailers hauled shipping containers loaded with goods from China to U.S. ports, then trucked them to warehouses and stores before reaching consumers. Now, many are bypassing this route, opting to individually package and ship items directly from China under the de minimis rule. This method not only avoids tariffs but also skirts the need for extensive warehousing.
The numbers are staggering. Packages entering the U.S. under the de minimis rule have skyrocketed to over one billion in 2023, up from a mere 140 million a decade ago. China is the chief contributor, sending more packages than all other countries combined.
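Those two data points imply roughly a sevenfold increase, or about 22 percent compound annual growth over the decade:

# Growth in de minimis packages cited above: ~140 million a decade ago
# to over one billion in 2023.
packages_then = 140e6   # ~2013
packages_2023 = 1.0e9
multiple = packages_2023 / packages_then
cagr = multiple ** (1 / 10) - 1
print(f"About {multiple:.1f}x more packages, ~{cagr:.0%} compound annual growth")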
American businesses are pissed, saying that the rules create an uneven playing field, since brands with U.S. stores and warehouses pay more in tariffs than those shipping directly to consumers.
"De minimis is like a big tax incentive the U.S. is giving you to take the job somewhere else," lamented Peter Bragdon, general counsel at Columbia Sportswear. "It changes the math."
Mike Hesse, CEO of Nebraska-based Blue Ox, which manufactures tow bars for RVs, discovered Chinese knockoffs of his products being sold on Amazon and slipping into the country via de minimis.
"They're a safety issue, plus consumers are duped into thinking they're buying an American-made product," Hesse fumed. "That's how de minimis is affecting me."
Traditionally, to bring goods into the country, retailers would arrange for a shipping container of products to be brought into U.S. ports from overseas. (Image credit: Kristen Zeis for The New York Times)
Some companies have accused Chinese firms of dirty tricks, like falsifying invoices to sneak pricier items under the $800 threshold or faking shipping documents to send bulk goods duty-free.
Meanwhile, some retailers have shifted warehouses to Canada or Mexico. From there, they can swiftly and legally ship items duty-free into the U.S. when orders roll in - taking more American jobs with them.
As the Biden administration tightens the screws on de minimis shipments from China, fears are mounting that imports from our neighbors might surge as companies look for new loopholes.
Even Chinese giants are bracing for impact. Shein says it's open to reforming the tariff exemption and will adapt to keep customers happy. Temu has started highlighting products from "local warehouses," a move seen as hedging against regulatory changes.
"This is clearly a strategic move to limit exposure to any regulatory shifts," noted Juozas Kaziukenas, founder of e-commerce intelligence firm Marketplace Pulse.
All eyes are now on Washington. While lawmakers on both sides of the aisle have floated proposals to narrow the de minimis exemption, it's uncertain whether they'll rally behind a unified plan.
"The reason we really would like to see certainty is so everybody can plan business accordingly," said Donald Tang, executive chairman of Shein, just a day before the administration's bombshell announcement. "If everything is hanging in the middle... it's not good for the business planning process."
For Amazon, Walmart, and countless others, the race is on to adapt—or risk being left in the dust.