![NVIDIA GeForce GT 930M](https://www.techpowerup.com/gpu-specs/images/c/2644-front.small.jpg)
You do not need to worry, though, about being unable to use certain software or train certain networks because of the GPU's limitations.
![NVIDIA GeForce GT 930M benchmark](https://imgs.developpaper.com/imgs/3734645365-58914342df599_articlex.png)
The NVIDIA GeForce 930M is a lower-mid-range GPU found in budget and multimedia-oriented notebooks. It was released in March 2015 and is based on the previous year's GeForce 840M, with a GM108 GPU inside. We expect it to be only marginally faster than its predecessor, since the Maxwell-generation GPUs mostly improve power efficiency. It also intelligently conserves battery life with NVIDIA Optimus technology. It is not a bad GPU to train on, especially if you are just beginning with Deep Learning, but it is not the best either. Frameworks that demand a more capable GPU will still run, but only on the CPU, not on your GPU.
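As a minimal sketch of that CPU fallback (assuming TensorFlow is installed; the helper name `pick_device` is my own, not part of any API), you can ask TensorFlow whether it sees a usable GPU and train on the CPU otherwise:

```python
def pick_device():
    """Return a TensorFlow device string: the first GPU if one is usable, else the CPU."""
    try:
        import tensorflow as tf
        # list_physical_devices returns an empty list when no supported GPU is visible
        if tf.config.list_physical_devices("GPU"):
            return "/GPU:0"
    except ImportError:
        pass  # TensorFlow not installed in this environment; fall through to CPU
    return "/CPU:0"

print(pick_device())
```

On a notebook whose GPU falls below the framework's minimum Compute Capability, this returns `/CPU:0` and training still works, just more slowly.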
![Nvidia GeForce 930MX rating chart](https://notebooks-und-mobiles.de/wp-content/uploads/2018/04/Rating_Nvidia_Geforce_930MX-595x413.jpg)
Any GPU can be used for Deep Learning training. This is mostly true for TensorFlow and PyTorch as well, but there are caveats, as discussed later. As for suitability for the task, it is true that some GPUs are better than others, for various reasons. One is simply performance: the more powerful the GPU, the quicker the training. Another is the CUDA Compute Capability of the given GPU, which you mentioned in your question. The GPU you mention, the NVIDIA GeForce GT 653M, is CUDA-enabled, but has a Compute Capability of only 2.1, which is on the low end. This also means that you won't be able to use TensorFlow or PyTorch with it, because they require a Compute Capability of at least 3.0 (thanks to janneb for pointing that out).

On the graphics side, NVIDIA advertises the GeForce 930M as delivering up to 3.5X faster graphics performance and accelerating photo and video editing applications.
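The Compute Capability check described above comes down to a tuple comparison. A small sketch (the 3.0 threshold is taken from the text; the helper name is illustrative):

```python
# Minimum Compute Capability that TensorFlow/PyTorch require, per the text above.
MIN_CAPABILITY = (3, 0)

def meets_minimum(capability):
    """Return True if a (major, minor) Compute Capability meets the framework minimum."""
    return tuple(capability) >= MIN_CAPABILITY

print(meets_minimum((2, 1)))  # GT 653M-class GPU at 2.1: False, too low
print(meets_minimum((5, 0)))  # Maxwell-era GPUs report 5.x: True
```

Tuple comparison is lexicographic, so `(2, 1) >= (3, 0)` is false while `(5, 0) >= (3, 0)` is true, matching how NVIDIA versions Compute Capability as major.minor.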