Multi GPU support
#13140
-
Yes, this is the intended behavior. Some users have asked for multi-GPU support for faster ML, but no work has been done or planned on it. Though even with multi-GPU support, I'm not sure 4 GB + 4 GB adds up to 8 GB for the facial grouping use case, since a single model still has to fit in one GPU's VRAM.
-
If you want to use multiple GPUs, you'll need to spin up an ML container for each GPU and a load balancer to spread requests between them.
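The per-GPU-container approach above can be sketched as a Docker Compose fragment. This is a hypothetical illustration, not official Immich configuration: the service names and the nginx upstream are assumptions, though the `release-cuda` image tag and port 3003 match a typical Immich ML setup, and the `device_ids` reservation is standard Docker Compose NVIDIA syntax.

```yaml
# Sketch: one Immich ML container per GPU, fronted by a load balancer.
services:
  immich-machine-learning-gpu0:
    image: ghcr.io/immich-app/immich-machine-learning:release-cuda
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["0"]       # pin this container to GPU:0
              capabilities: [gpu]

  immich-machine-learning-gpu1:
    image: ghcr.io/immich-app/immich-machine-learning:release-cuda
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["1"]       # pin this container to GPU:1
              capabilities: [gpu]

  # Illustrative load balancer; its nginx.conf would round-robin an
  # upstream over the two ML containers, e.g.:
  #   upstream ml {
  #     server immich-machine-learning-gpu0:3003;
  #     server immich-machine-learning-gpu1:3003;
  #   }
  ml-loadbalancer:
    image: nginx:alpine
    ports:
      - "3003:80"
```

The Immich server would then be pointed at the load balancer's address instead of a single ML container.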
-
I am using version 1.117.0 with the buffalo_l model for a facial recognition use case. My system has 2 vGPUs (NVIDIA A40), each with 4 GB VRAM. I am getting an out-of-memory exception in the machine learning container on GPU:0. My VM has 2 GPUs, but the model always uses GPU:0. Is this the intended behavior, or is it possible to use multiple GPUs for the facial recognition use case?
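For the "model always uses GPU:0" symptom, a single ML container can at least be moved off the busy GPU by pinning it to the other one. A minimal sketch using the standard Docker Compose NVIDIA device reservation; the service name and image tag are assumptions based on a typical Immich setup:

```yaml
# Sketch: pin the ML container to the second GPU (GPU:1) instead of GPU:0.
services:
  immich-machine-learning:
    image: ghcr.io/immich-app/immich-machine-learning:release-cuda
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["1"]   # expose only GPU:1 to this container
              capabilities: [gpu]
```

This does not pool the two 4 GB cards into 8 GB; it only selects which single GPU the container sees.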