4.707,35 €
Brand | PNY |
---|---|
Manufacturer | NVIDIA |
Series | NVIDIA RTX A6000 |
Product dimensions | 38.1 x 8.38 x 24.13 cm; 1.18 kg |
Item model number | VCNRTXA6000-PB |
Graphics coprocessor | NVIDIA RTX A6000 |
Graphics chipset brand | NVIDIA |
Graphics card description | Dedicated |
Graphics card memory type | GDDR6 |
Graphics card memory size | 48 GB |
Graphics card interface | PCI Express |
Batteries included | No |
Item weight | 1.18 kg |
Software updates guaranteed until | Unknown |
William Daugherty –
Good service
JJG –
Now that I've experienced the heat exhausting out the back, I see what a perfect solution this is. Why aren't all video cards like this?
You can never have enough memory for deep learning. Don’t believe any analysis that says you need less memory.
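The reviewer's point about VRAM can be put in perspective with a back-of-the-envelope calculation. A minimal sketch, where the model size and bytes-per-parameter figures are illustrative assumptions rather than anything stated in the review:

```python
def estimate_vram_gib(n_params, bytes_per_param):
    """Rough VRAM estimate in GiB: parameter count times bytes per parameter.

    Deliberately ignores activations, KV caches, and framework overhead,
    which can add substantially more in practice.
    """
    return n_params * bytes_per_param / 2**30

# A hypothetical 7-billion-parameter model held in fp16 (2 bytes/param):
inference = estimate_vram_gib(7e9, 2)   # ~13 GiB: fits comfortably in 48 GB

# Training with Adam in mixed precision is commonly budgeted at roughly
# 16 bytes/param (fp16 weights + grads, fp32 master weights + two moments):
training = estimate_vram_gib(7e9, 16)   # ~104 GiB: more than one card holds

print(f"inference ~{inference:.1f} GiB, training ~{training:.1f} GiB")
```

Even on this optimistic estimate, training a mid-sized model saturates a 48 GB card, which is consistent with the "never enough memory" sentiment.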
Chaqui Bisquet –
This card is for 3D design, 3D rendering, animation, and any GPU-intensive task. I had just built a new system when the graphics card shortage hit. This card was out of my price range, but out of frustration, as nothing was available at or near MSRP, I bought it anyway. While it was way too expensive for my budget, it was being sold at close to MSRP. I so wanted to feel the guilt of buyer’s remorse, but I couldn’t. The card is amazing: it cut my render times to a fraction of my previous times, the 48 GB of VRAM can handle any complex scene I throw at it, it doesn’t make my computer so hot I could fry an egg on it, and best of all it doesn’t scream like a lost soul condemned to the Inferno when it’s working (the screaming, I couldn’t stand the screaming).
garrettg84 –
I’ve been using two of these cards for AI/ML research. They are surprisingly quiet compared to what I expected, and significantly quieter than their gaming counterparts. Temps creep up to about 85 °C and stabilize right there while running longer tasks. They stay right at their 300 W power envelope as well. For the work I’m doing, I’m not sure the NVLink was worth it. I’ve tried some of the models I’m working with both with and without the NVLink bridge in place, and there’s no discernible difference. There may be some edge cases I get to in the future, but for inference on GPT models and toying with Stable Diffusion, I’ve not seen a difference.
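Comparing throughput with and without the NVLink bridge, as this reviewer did, typically comes down to repeated timed runs of the same workload. A minimal, framework-agnostic timing harness one might use for such A/B comparisons; the warmup and run counts are arbitrary assumptions, and the reviewer’s actual method isn’t described:

```python
import statistics
import time

def benchmark(fn, warmup=3, runs=10):
    """Return the median wall-clock seconds for fn(), after warmup calls.

    Warmup absorbs one-time costs (allocation, JIT compilation, cache fill);
    the median resists outliers better than the mean does.
    """
    for _ in range(warmup):
        fn()
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Usage: time the identical inference call in both hardware configurations
# and compare the medians, e.g.
#   with_bridge = benchmark(lambda: model.generate(prompt))
# (model and prompt are placeholders for whatever workload is under test.)
```

If the two medians agree within run-to-run noise, as they apparently did here, the bridge isn’t helping that particular workload.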
Peter –
Bought this one for deep learning, and it freaking delivers results!!!