

In the past couple of years, external GPUs have been on the rise, helping Macs with otherwise low graphics performance get a boost for things like video rendering and gaming. For example, the 2018 Mac mini has serious performance potential with a 6-core i7 processor that outperforms even the best CPU in the 2018 MacBook Pro. However, its diminutive size means it doesn't house a dedicated graphics card, so anyone who needs more graphics performance has to resort to an eGPU.

The problem: macOS Mojave dropped support for new Nvidia graphics drivers, except for a couple of chipsets included in still-supported Apple laptops - all of them outdated. And with Nvidia drivers not seeing support in macOS Mojave, those who already own Nvidia cards are out of luck.

The beauty of the modular Mac Pro up until 2012 was that you could swap out the graphics cards to keep the Mac Pro up to date with the latest graphics rendering technology and performance, but those who opted for Nvidia cards are stuck with old macOS software, and that can be infuriating.

So why is there no support for Nvidia drivers? What caused this, and what can you do about it? We'll tell you what you can do in just a minute, but first let's go back in time and see how Apple and Nvidia's relationship fell apart.

The first Mac to include an Nvidia graphics processor was released in 2001, but Apple was still using chips made by ATI, the company that was eventually bought out by AMD in 2006. In 2004, the Apple Cinema Display was delayed, reportedly because of Nvidia's inability to produce the required graphics card, the GeForce 6800 Ultra DDL. Then in 2008, Apple's MacBook Pro shipped with Nvidia graphics chips that revolutionized the MacBook by taking over the functions of the Northbridge and Southbridge controllers alongside actual graphics rendering.
