Apple vs. Nvidia: What happened? (2024)

If you've been using macOS for a while, you might remember a time when Apple offered GPU options from both ATI (later acquired by AMD) and Nvidia. In fact, the Macintosh was the first platform to sport the GeForce 3 in 2001. Nvidia even made a special chipset, found in the 2008 MacBooks, that delivered better GPU performance by skipping Intel's integrated chipsets.

Then suddenly, Apple stopped using Nvidia hardware. The last Macs featuring an Nvidia GPU date to 2015.

The video version differs slightly, as it includes more personal anecdotes and asides.


AppleInsider isn't my favorite source for Apple news, as it's too evangelical, generally portraying Apple as the protagonist in its reporting. Still, I have to give them credit: they've followed the Apple/Nvidia saga better than any other publication. Their "Apple's management doesn't want Nvidia support in macOS, and that's a bad sign for the Mac Pro" is a great first stop, but it's a bit dated and self-referential. I've tried to piece together the narrative as told by many news reports over the years, much of which I read as it was happening. It's a topic of particular interest to me, as it dates back to when I bought my first Nvidia GPU in 2001, a VisionTek GeForce 3, and used DOS with nvflash.exe to load the Mac firmware onto the GPU. It was a crazy leap of faith: I'd read a post on XLR8yourmac.com (once a powerhouse of a website for power users) from someone claiming to have done it, and afterwards I reported the steps I used to flash the card back to the community. Over the years I wrote a few popular guides on using Nvidia GPUs on the Mac, and I wrote a lot about Mac GPUs as part of my monstrous The Definitive Classic Mac Pro (2006-2012) Upgrade Guide. I don't have any particular insider info, but what I do have is the power of hindsight.

The history of Nvidia and Apple

The first Mac to ship with an Nvidia GPU was the Power Mac G4 (Digital Audio) in 2001, which carried a GeForce 2 MX; at the same time, Apple offered the GeForce 3 as an optional upgrade on its Power Macs.

In 2004, the release of Apple's 30-inch Cinema Display was delayed by poor yields of Nvidia's GeForce 6800 Ultra, with Nvidia unable to produce the cards in a timely enough fashion for Apple's liking. Still, Apple continued to offer plenty of Nvidia options. As important as Apple was during this time frame, it wasn't the goliath it is today.

The year 2008 is when the relationship with Nvidia changed, during a flurry of events. Apple was pulled into a legal battle that was primarily between Nvidia and Intel. To understand this, we have to jump back to 2004.

In 2004, Intel and Nvidia signed a patent licensing agreement allowing Nvidia to build chipsets for Intel CPUs; whether it covered CPUs with integrated memory controllers would later become the sticking point. Then in 2008, Nvidia produced chipsets, the MCP79 and later the MCP89, that bypassed Intel's own Northbridge (memory controller) and Southbridge (I/O controller). Apple was the first PC maker to adopt Nvidia's new chipset. The advantage was that Apple would be able to simplify its GPU strategy: it could stop using the underwhelming Intel integrated GPUs in its laptops and unify the lineup to mirror the desktops. At the time, Intel's integrated GPUs were pretty bad and could not support OpenCL, limiting how much work the OS could reliably offload to the GPU.

Intel was much more central to Apple as a business partner, and Intel enjoyed having Apple on its customer roster. Nvidia pulling a fast one on Intel put Apple squarely in the middle of the resulting controversy.

Predictably, Intel then filed suit against Nvidia, throwing Apple's plans into disarray. Neither company endeared itself to Apple, and the squabble had many industry watchers speculating that Apple might look into AMD processors, even though AMD had very few competitive offerings in the laptop space. Nvidia tried to court Apple into its legal saga but ultimately failed, leaving Nvidia feeling spurned. Apple continued to use Nvidia GPUs, but sadly, its lower-end offerings were constrained to Intel's supremely mediocre integrated GPUs. This wasn't the only issue in Apple's relationship with Nvidia.

Meanwhile, in 2008 Nvidia was hit with a securities lawsuit over knowingly shipping faulty GPUs and trying to mitigate the problem through firmware, burning $196 million on replacements. HP said at the time that it had 24 laptop models affected, and Dell had 15. Apple had two, the MacBook Pros using the GeForce 8600M GT.

GPUs were failing at a steady clip (not just for Apple), and Apple had to extend its warranties for consumers in 2009 (a program that ran through 2012) and issue a software update in 2009 to try to mitigate the GPU issues. The problem came down to the solder holding the chip to its printed circuit board cracking under thermal stress. Apple still landed in a class action lawsuit over it. Nvidia saw Apple as a smaller player and refused to cover support costs beyond an unknown amount of money (it only handed Dell $10,000,000, and only after Dell threatened to drop Nvidia), putting another twist in the Apple relationship. By most accounts, this was the dividing moment.

By this point, multiple publications reported a frosty air between Nvidia and Apple, although the high-end MacBook Pros would continue to use Nvidia GPUs.

Apple tried AMD GPUs in the 2011 MacBook Pros and ended up in yet another class action lawsuit over faulty GPUs. Apple would switch back to Nvidia for the 2012 MacBook Pros.

2013 marked a substantial shift away from Nvidia. Apple partnered with long-time Nvidia rival AMD to produce custom variations of the Radeon FirePro for the 2013 Mac Pro. The iMac followed in 2014, moving to AMD with the introduction of the 5K iMac.

If there was any hope of Nvidia and Apple reconciling, 2014 was the end of it. Nvidia went litigious, filing suit against Samsung and Qualcomm over mobile graphics patents. It went as far as trying to block shipments of Samsung's Galaxy S, Note, and Tab lines, with speculation that Nvidia wanted the iOS and Android GPU business. At the time, Apple was still relying on components from Qualcomm and Samsung for its mobile devices.

Things then seemed quiet. Nvidia had ported CUDA to macOS and created its Web Drivers, even while it was still producing GPUs for Apple as the relationship fizzled.

Apple had embraced OpenCL, the popular open framework for GPU-accelerated computing tasks. Nvidia had created its own closed alternative, CUDA, and used its marketing power to court software publishers to adopt it over OpenCL. CUDA did not work on AMD hardware, giving Nvidia a competitive advantage whenever a software maker chose CUDA. Adobe embraced CUDA, even on macOS, which earned CUDA a favored position among creative professionals, especially those using the Adobe suite; Adobe went as far as building CUDA-specific features for Nvidia GPUs. In the background, Apple was poaching industry talent for its own GPU ambitions.
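To make that lock-in concrete, here's a minimal, hypothetical sketch of my own (not code from the article, Adobe, or Nvidia): a trivial CUDA vector add. Code written this way is tied to Nvidia hardware and drivers, so a vendor that builds its acceleration path on CUDA has to rewrite both the kernels and the host code before anything runs on AMD or Apple GPUs.

```cuda
// Hypothetical illustration, not from the article: a trivial CUDA vector add.
// Everything here -- the __global__ qualifier, the <<<grid, block>>> launch
// syntax, and the cudaMalloc/cudaMemcpy runtime calls -- is Nvidia-specific,
// which is why CUDA-based apps can't simply move to OpenCL or Metal.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vectorAdd(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers.
    float *hA = new float[n], *hB = new float[n], *hOut = new float[n];
    for (int i = 0; i < n; ++i) { hA[i] = 1.0f; hB[i] = 2.0f; }

    // Device buffers and host-to-device copies (CUDA runtime API).
    float *dA, *dB, *dOut;
    cudaMalloc(&dA, bytes); cudaMalloc(&dB, bytes); cudaMalloc(&dOut, bytes);
    cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

    // Kernel launch: 256 threads per block, enough blocks to cover n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(dA, dB, dOut, n);

    cudaMemcpy(hOut, dOut, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", hOut[0]);  // expect 3.0

    cudaFree(dA); cudaFree(dB); cudaFree(dOut);
    delete[] hA; delete[] hB; delete[] hOut;
    return 0;
}
```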

Nvidia continued a quiet macOS strategy, bringing support for its newer GPUs to macOS and keeping CUDA updated. This meant classic Mac Pro owners, eGPU users, and Hackintosh users could enjoy the latest Nvidia hardware under macOS, and this continued uninterrupted for nearly seven years. Many Mac professionals invested in Nvidia hardware, as AMD's offerings generally paled against Nvidia's at the high end, and CUDA offered a lot more performance in Adobe video applications like Premiere Pro and After Effects. Nvidia didn't overtly flaunt its web drivers, and it came as a surprise to many Mac users to learn that they could buy Nvidia GPUs and use them in their Mac Pros. As a personal anecdote, I wrote two popular guides on using GeForce 700-series and GeForce 1000-series GPUs in a Mac Pro.

With the release of macOS 10.14 Mojave, everything changed. Outside of the people at Infinite Loop, no one knew for sure that Apple's grand ambition was to merge macOS and iOS hardware. Most users at the time feared the iOS-ification of Apple's software, not its hardware.

For years, Microsoft had a huge leg up in the graphics department by owning its own graphics API in the form of DirectX. OpenGL, Apple's preferred graphics API, had floundered in the late 2000s, whereas DirectX, for all its faults, leaped ahead of OpenGL in graphics capabilities and support.

Rather than wait for the next open standard, Vulkan, to formalize, Apple developed its own graphics API, Metal, for use with iOS. Microsoft's DirectX almost certainly inspired Metal. Bringing Metal to macOS was all but a given, and it was ported to macOS in 2015, set to replace both OpenGL and OpenCL and to skip Vulkan support entirely.

macOS 10.14 Mojave required Metal-compatible GPUs. At some point during the Mojave beta, Apple pulled Nvidia's ability to sign its code, which ended Nvidia's support for macOS in one spiteful, anti-competitive move. For GPUs to be Metal-compatible, they needed drivers, and Nvidia wasn't able to release them.

Nvidia publicly announced on its forums that it had working Metal drivers but that Apple had revoked its developer license, leaving the blame squarely at Apple's feet. Nvidia even called out Apple on its support page, though it has since modified it.

My personal take is that it boiled down to CUDA, Metal, and the M1. CUDA represented a significant problem for Metal adoption: in order to get professional applications on board with Metal, Apple had to cut out CUDA, and my guess is that Nvidia was not willing to give up CUDA in its drivers. Yet again, this was the impasse between Apple's management and Nvidia.

For Apple to launch Apple Silicon smoothly, it needed everyone to support Apple's current technologies, and CUDA was a roadblock to that success.

Apple also knew the aftermarket install base for Nvidia GPUs was quite small, limited to classic Mac Pro users, adventurous eGPU owners, and the Hackintosh community. The group this affected is one Apple has seemed vaguely resentful of over the past decade: users who like modular computing. Axing Nvidia was another blow against modularity and another win for Apple's tight-fisted control over when products become obsolete.

The goalposts have now changed. The question isn't whether Nvidia and Apple will get along; it's whether Apple will allow external or dedicated GPUs at all. At the time of writing, MacRumors' buyer's guide suggests that Apple will release GPUs that outstrip AMD's and Nvidia's current offerings.

Usually, MacRumors is pretty on point. Still, I'm hyper-skeptical: the year-over-year gains in the GPU market haven't just been consistent, they've been accelerating. Nvidia and AMD are two of TSMC's biggest clients, so they, too, will have access to the same manufacturing processes as Apple. They've been making GPUs much longer, and they're very good at it.

I have a very unusual take on this whole thing: in the future, we're going to see Macs that absolutely rock at laptop performance and low wattage.

Also, we'll probably see iMacs in a year or two that can edit 8K natively but can't ray trace and are pretty crap when it comes to things like TensorFlow.

To quote myself after I received my first Apple Silicon Mac in December 2020: "For the portable class of computing, Apple Silicon looks like it'll be unmatched, and expensive brute force versus efficiency will be the story of x86 versus Apple Silicon versus ARM, and I expect there will always be a clear winner. Welcome to the next decade of computing."


FAQs

Why did Apple stop working with Nvidia?

For GPUs to be Metal-compatible, they needed drivers, and Nvidia wasn't able to release them. Nvidia publicly announced on its forums that it had working Metal drivers, but Apple had revoked its developer license, leaving the blame squarely at Apple's feet.

Will Nvidia overtake Apple?

"We believe Nvidia can surpass Apple by capitalizing on the artificial intelligence economy, which will add an estimated $15 trillion to GDP."

Does Nvidia partner with Apple?

Nvidia is bringing its Omniverse Cloud platform to Apple's headset, allowing users to interact with objects and design directly through the Vision Pro. The basis of support is a set of Omniverse Cloud APIs that can stream Omniverse assets to Apple's headset.

What company stopped working with Nvidia?

EVGA is reportedly making the decision to no longer work with Nvidia because it feels the company was a bad partner, according to both Gamers Nexus and JayzTwoCents.

Who is NVIDIA's biggest rival?

Huawei developed the Ascend series of chips as a rival to Nvidia's line of AI chips. The Chinese company's main product, the 910B chip, is its main rival to Nvidia's A100 chip, which launched roughly three years ago. Analysts have estimated China's AI chip market to be worth $7 billion.

Can Apple silicon beat NVIDIA?

Results show that the M1 Pro chip doesn't quite meet the Nvidia GPU's performance, taking 216 seconds to process the audio compared to the 4090's 186 seconds. However, newer Apple chips have much better performance.

Does Nvidia have a future?

High-end chip maker Nvidia (NVDA) has smashed expectations year over year, with annualized revenue growth of 60% since 2021. With yearly releases of improved graphics processing units (GPUs) for both gaming and cloud computing, the company has been able to continuously offer the best GPU hardware on the market.

Is Nvidia bigger than Apple?

As of this writing, Nvidia has a market cap of $2.2 trillion, making it the third-largest American company, behind only Microsoft and Apple. That puts it 'only' $500 billion behind Apple and about $1 trillion behind Microsoft.

Where will Nvidia be in 5 years?

So, Nvidia's revenue is on track to increase 5 times in a space of five years considering its fiscal 2024 forecast, translating into a compound annual growth rate (CAGR) of 38%. A similar CAGR over the next five years would take Nvidia's annual revenue to a whopping $295 billion in fiscal 2029.

Does Pixar use NVIDIA?

NVIDIA is also working with an ecosystem of partners through the Alliance for OpenUSD (AOUSD)—including Pixar, Adobe, Apple, and Autodesk—to evolve USD as it becomes one of the building blocks in the era of AI and industrial digitalization.

Do Macs use NVIDIA?

MacBook Pro 16-inch laptops with 3,456×2,234 ProMotion 120Hz refresh-rate displays enable gaming in 4K high dynamic range at up to 120 fps. With NVIDIA DLSS 3 technology, these Macs can even run graphically intense games like The Witcher 3 and Warhammer 40,000: Darktide at 4K 120 fps.

Is NVIDIA partnered with Amazon?

AWS and NVIDIA Extend Collaboration to Advance Generative AI Innovation. GTC—Amazon Web Services (AWS), an Amazon.com company (NASDAQ: AMZN), and NVIDIA (NASDAQ: NVDA) today announced that the new NVIDIA Blackwell GPU platform — unveiled by NVIDIA at GTC 2024 — is coming to AWS.

Is Nvidia a China company?

Nvidia is an American technology company that designs, manufactures, and sells semiconductor chips and graphics processors, along with other software.

Is Nvidia American owned?

NVIDIA Corporation (NVDA) is an American semiconductor company and a leading global manufacturer of high-end graphics processing units (GPUs). Based in Santa Clara, California, NVIDIA holds approximately 80% of the global market share in GPU semiconductor chips as of 2023.

Who is Nvidia owned by?

Nvidia (NVDA) Ownership Overview

The ownership structure of Nvidia (NVDA) stock is a mix of institutional, retail and individual investors. Approximately 39.13% of the company's stock is owned by Institutional Investors, 3.98% is owned by Insiders and 56.89% is owned by Public Companies and Individual Investors.

Why did Apple switch from NVIDIA to AMD?

Apple's continued use of AMD graphics comes down to power consumption. The NVIDIA GTX 1060, which is an equivalent card to the 15-inch MacBook Pro's Radeon Pro 560X, uses about 37 watts more than the Radeon Pro 560X and a whopping 75 watts more than the stripped-down 13-inch MacBook.

Does Apple support NVIDIA GPUs?

Apple used to work closely with NVIDIA, until 2008. In 2007 and 2008, Apple made MacBook Pro models with the ill-fated NVIDIA GeForce 8400M and 9400M. These GeForce chips had a manufacturing defect in the chip die that caused the GPUs to fail prematurely. NVIDIA took a $200,000,000 hit on the faulty chips.

Why doesn't Apple make GPUs?

Apple doesn't want to supply an industry with components; it wants to supply an industry with an entire ecosystem. Apple's approach to technology is to provide the whole setup. It will, sooner or later, and GPUs will be vintage hardware.

Why is Apple not suitable for gaming?

It's true, MacBooks aren't the best for gaming

Historically, MacBooks have been built with underpowered graphics cards, usually made by AMD or Intel for graphic design apps instead of games. These GPUs couldn't run advanced 3D games, and were far outclassed by the kinds of GPUs you'd find in PCs for the same price.
