After my raging success at getting a GPU-accelerated deep learning stack installed and running on Ubuntu, I wondered - could it be done on my MacBook? Third-generation 15" MacBook Pros should be capable of running deep learning on their NVIDIA GPUs, the fourth-generation 16" models have relatively beefy AMD cards that run OpenCL, and much was made of using the new M1 graphics cores and "neural engine" for rapid deep learning, so surely it should be doable?

The official party line from TensorFlow and from NVIDIA is no. But that hasn't stopped Apple releasing a Mac-optimised TensorFlow and, more excitingly, a platform-optimised graphics API called Metal with a specific TensorFlow plugin. (You can also apparently get things working with PlaidML, but I've not tried that.)

This is how I got things working on my Intel-based 16" 2019 MBP, but it should also work on the new M1 MBPs.

Step 0: Check prerequisites

You'll need a 15/16" Intel MBP running the latest macOS 11 Big Sur, or an M1 MBP (which comes with Big Sur as default).
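If you want to sanity-check the prerequisites from a script before going any further, a small Python snippet can do it. This is just a sketch of my own devising (the `check_prereqs` helper is hypothetical, not part of any Apple or TensorFlow tooling): it uses the standard-library `platform` module to confirm you're on macOS 11 or later, on either an Intel (`x86_64`) or Apple Silicon (`arm64`) machine.

```python
import platform

def check_prereqs():
    # Hypothetical helper: True only on macOS 11 (Big Sur) or newer,
    # running on Intel (x86_64) or Apple Silicon (arm64).
    if platform.system() != "Darwin":
        return False
    mac_release = platform.mac_ver()[0]  # e.g. "11.6" on Big Sur
    if not mac_release:
        return False
    major = int(mac_release.split(".")[0])
    return major >= 11 and platform.machine() in ("x86_64", "arm64")

if __name__ == "__main__":
    print("Prerequisites met:", check_prereqs())
```

On anything other than a Big Sur (or later) Mac this simply reports `False`, which is exactly what you'd want before attempting the install steps below.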