33 Replies · Latest reply: Apr 14, 2015 12:48 PM by Neil Paisnel
  • nextech Level 1 (35 points)

    Steven 1994,


    > In fact, I'm not a gamer at all, it is not for Heavy Gaming.


    Throw a second Radeon HD 5870 into the computer (just make sure it has the EFI BIOS flash), get a $6 power adapter from Fry's, and quit worrying about it.  It's easy, and it's simple.


    > - as Apple Mini DisplayPort to Dual-Link DVI Adapter already cost 99$,

    > is it not worth to buy another ATI 5870 in order to able to connect more screens in future?


    In my opinion, yes, I would just get a second graphics card.  Especially if all you're doing is powering extra displays (you're not really even using the GPU on the second card; it just sits idle).  So yes: get a second graphics card, spend $6 on the power cable, and hook it up as I originally described.


    For trading and standard 2D work, it's perfectly fine.

  • nextech Level 1 (35 points)



    If you want to know EXACTLY where you are in terms of power draw, get a $19 P4400 Kill-A-Watt, plug it into the wall, plug your Mac Pro directly into it, and you can see exactly what your electrical draw is.


    If you are truly getting anywhere close to 900+ watts, or approaching 1000+ watts of draw, then yes, I would be a little worried and would look into getting a second power supply.  But the Apple Mac Pro has a decent, stable power supply in it, and it seems fairly rock solid.  Almost all of my Mac Pros have one single power supply in them.  Only two of them have a secondary power supply, and that is only because we use those two machines specifically for 3D rendering.  (And yes, those machines get quite hot; we have dedicated 20-amp circuits for each machine, and keep them in a separate, properly cooled server room at around 50 degrees when we are doing heavy rendering.)  As for what you are doing, which is mostly just 2D desktop work, that second GPU will probably be sitting idle almost all of the time.


    I'm guessing that you probably won't even be over 750-800 watts total power draw on your system.


    Adding a second power supply is "overkill," in my opinion, unless you are doing heavy gaming (which you already stated you are not).


    If your power draw is getting relatively high (and you're extremely worried), then just throw in a $24 Juice Box (a 450-watt power supply).  It mounts in your second optical drive bay, and you can use it to directly power your second graphics card.  But I wouldn't think it necessary unless you're doing heavy 3D rendering and are somehow running BOTH of your GPUs at 100% utilization.


    Take a peek at your GPU utilization while you're trading; mine is fairly low.


    Also, if you're worried about power draw, just get a $19 Kill-A-Watt, plug it into the wall, and test your system BEFORE you add the second graphics card, then test it again AFTER.  You'll see there isn't really all that much difference (maybe 100-140 watts?).
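    To make the before/after arithmetic concrete, here is a minimal sketch of the headroom check a Kill-A-Watt enables. The supply rating, the 20% safety margin, and the sample readings are illustrative assumptions (the thread's figures put the Mac Pro supply around 980-1000 W), not measurements:

```python
# Rough headroom check for a Kill-A-Watt style before/after comparison.
# PSU_RATING_W, the margin, and the sample readings are illustrative assumptions.

PSU_RATING_W = 980  # approximate Mac Pro supply rating discussed in this thread


def headroom(measured_draw_w, psu_rating_w=PSU_RATING_W, margin=0.80):
    """Watts remaining before reaching `margin` of the supply rating."""
    return psu_rating_w * margin - measured_draw_w


before = 450            # hypothetical wall reading before the second card
after = before + 130    # plus the ~100-140 W delta mentioned above

print(headroom(before))  # 334.0
print(headroom(after))   # 204.0
```

    Keeping a 20% margin is an arbitrary but common rule of thumb; the point is simply that a ~130 W second card still leaves comfortable headroom on a mostly 2D workload.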


    If you just need to power two extra monitors, I have used Radeon HD 2600XTs: they are older cards, but they use far less electricity (18 watts).


    Here is a Kill-A-Watt device:


    This way you can tell exactly what you are using in terms of watts/amps.  If your computer is sitting idle, you are not going to draw that much power.  I guarantee that second graphics card will be idle the majority of the time (unless you're doing 3D rendering).

  • The hatter Level 9 (60,925 points)

    You finally come down to what I said to begin with:


    If your power draw is getting relatively high (and you're extremely worried) then just throw in a $24 Juice Box (450 watt power supply).


    You keep mentioning rubbish 2600XTs, which no one is, or should be, using.


    People running high-end cards are usually the concern: dual GTX 680s, or any card combination that needs more than 2 x 6-pin (150W total) of additional power.


    As for 4-pin Molex: EVGA provides the adapters but does not support using them, even with their own cards.  If someone measures them directly, I would be surprised they would drive that much load through connectors meant to power a couple of optical drives or disk drives.


    For all the extensive discussion, it seems to confuse readers rather than help them, and it misleads.  It makes it sound like it is "okay" to use a splitter, and endorses going beyond the power design, as well as "well, I'm only doing such and such and don't need to worry about the power draw of ____", except people are looking to actually use one or two graphics cards that exceed the design and the power provided.


    Putting a $599 GTX 680 at risk is not good advice (EVGA's being 2 x 6-pin, there isn't enough to power a second one without a 450W PSU), nor is driving an ATI 5x70 that way.


    (And any combination of two Nvidia cards does not work in OS X, so using GT120s is only going to trigger kernel panics and problems.)

  • avner0x Level 1 (0 points)



    I know this is an old thread but I was wondering if I can ask you for some help.


    I have a 2009 Mac Pro with dual ATI Radeon 7950s, the Sapphire Mac Edition kind.

    I would really like to have both cards in my Mac Pro.

    At first I bought two 6-pin adapters so that I could power both directly off the motherboard, but the cards started artifacting,


    so I knew they were not getting enough power.

    I read on a forum that the Sapphire 7950s can draw up to 386 watts each, and the Mac Pro's power supply is 1000 watts.


    I currently have in the Mac Pro:


    W3690 CPU


    Apricorn Velocity Duo SSD holder with 2x 250 GB Samsung PM851s


    CalDigit USB 3.0 card


    2x SuperDrives


    currently one Sapphire 7950 Mac Edition


    32 GB RAM


    All of this, I am sure, uses considerable power, so is there really enough left over to run two 7950s?
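    For a rough sanity check, the components above can be summed against the supply rating.  Every wattage below is an illustrative guess (only the 386 W per-GPU peak and 1000 W supply figures come from this thread), so treat this as a sketch, not a measurement:

```python
# Back-of-envelope power budget for the build listed above.
# All component wattages are rough guesses for illustration only.

components_w = {
    "W3690 CPU": 130,                  # 130 W TDP
    "2x Samsung PM851 SSDs": 8,
    "CalDigit USB 3.0 card": 5,
    "2x SuperDrives (mostly idle)": 10,
    "32 GB RAM": 20,
    "board, fans, misc": 60,
}

GPU_PEAK_W = 386   # per-card peak quoted above for the Sapphire 7950
PSU_W = 1000       # supply rating quoted above

base_w = sum(components_w.values())
for n_gpus in (1, 2):
    total_w = base_w + n_gpus * GPU_PEAK_W
    print(f"{n_gpus} GPU(s): ~{total_w} W of {PSU_W} W")
```

    With both cards at the quoted peak, the rough total lands right at the supply's rating, which would fit both the artifacting and the second-supply advice elsewhere in the thread, though real-world draw on light work sits far below these peaks.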


    Since I have a 2009 Mac, I don't have any standard Molex connectors in the DVD bay like the 2008, so I was


    wondering if there are still any adapters that could work for me?


    Thank you in advance for your help!

  • Grant Bennet-Alder Level 9 (55,350 points)

    "Regular folks" who do not want to mess around with adapter cables for power can run two 5770s (each 5770 requires one of the two aux power connectors). That can get you six displays.


    Only the 2008 and earlier have Molex connectors available in the Optical Drive bays -- 2009 and later would require pulling additional 12 Volts off a SATA connector.


    What users who install multiple high-end cards sometimes discover is that over-drawing the Mac Power supply will cause your Mac to do an uncontrolled power-off in mid rendering. That is why the more conservative "add a 450 Watt additional supply" is the recommended solution.


    There was a problem running multiple NVIDIA cards such as the GT120 for a while, but I believe that has been fixed in 10.10 and later. The GT120 will only work in 2008 and later, but it does not require any aux power.

  • John Lockwood Level 5 (7,255 points)

    The 7950 is a power-hungry card. You can get adapters to convert the optical-bay SATA connectors to Molex connectors, which you can then convert to PCI power connectors. As per an earlier message in this thread, sharing the load between the two cards would probably be a good idea: each card would have one real PCI power connection and one SATA-to-Molex-to-PCI connection, and each card would also draw power from its PCI slot.


    The other alternative would be the second power supply route.


    See for examples of SATA-to-Molex converters; remember, these mean you cannot use the optical bays. You would instead have to use an external USB optical drive.


    I have no idea whether two 7950 cards would overload the capacity of the 2 x PCI slots + 2 x aux connectors + 2 x SATA-to-Molex feeds; depending on usage (e.g. heavy 3D), it could be borderline.

  • avner0x Level 1 (0 points)

    I appreciate the speedy responses & advice,


    I have managed to order three 15-pin SATA to PCI Express 6-pin power adapters off Amazon.


    That should allow me to draw power from my unused SATA ports for the video cards.

    I will definitely take the advice of using one motherboard 6-pin port per video card and only


    one adapter as well, so each card gets 75 watts from the PCIe slot + 75 watts from the motherboard + 75 watts from the SATA port at max load.
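    That per-card arithmetic can be written out explicitly.  The 75 W slot and 75 W 6-pin figures are the PCIe-spec allowances; whether a single SATA feed can safely deliver a full 75 W is a separate question, so this is a sketch of the budget, not a guarantee:

```python
# Per-card power budget under the wiring described above.
# PCIe spec allowances: 75 W from the slot, 75 W per 6-pin connector.

SLOT_W = 75
SIX_PIN_W = 75


def card_budget_w(n_six_pin_feeds):
    """Maximum spec watts available to one card: slot plus 6-pin feeds."""
    return SLOT_W + n_six_pin_feeds * SIX_PIN_W


# One motherboard 6-pin plus one SATA-adapter 6-pin per card:
print(card_budget_w(2))  # 225
```

    225 W per card is well short of the 386 W peak quoted earlier in the thread for the 7950, so the cards could still artifact under a full 3D load even with the adapters in place.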


    I was considering a PCIe riser card to maximize the number of PCIe slots I can use, since the second 7950 will block an additional slot, but


    decided against it.


    I just found a very cool 7 port usb 3.0 pcie card on amazon that runs off the hard drive bays as well.


    Once the Amazon adapters arrive in the mail, I will see for myself whether the video cards run without artifacting.


    Thank you everyone!

  • John Lockwood Level 5 (7,255 points)

    There is a 4-port version of that card which does not need a Molex power connector; it uses power just from the PCI slot. It might, I believe, run a little slower, but I think the simplicity is worth it.


    There are only two optical drive bays, so ideally you use at most two Molex converters.

  • Neil Paisnel Level 1 (30 points)

    Grant Bennet-Alder wrote:




    What users who install multiple high-end cards sometimes discover is that over-drawing the Mac Power supply will cause your Mac to do an uncontrolled power-off in mid rendering. That is why the more conservative "add a 450 Watt additional supply" is the recommended solution.






    I always wondered about that.  I have done it on old PC hardware before, when running them as NAS4Free servers: find another scrap PC, pull the PSU, then short the green wire to the black via a relay powered from the main PSU.


    I was just worried about doing that on a graphics card, though, since the card would then be receiving power from both the PCIe slot AND the external PSU.  OK, so you'd make the grounds common, but the 12-volt and 5-volt lines of an external PSU are unlikely to be exactly the same as the Mac's internal PSU.  So you would have slightly different voltages feeding the graphics card: the PCIe voltage and the external PSU voltages.


    Care to enlighten us on how you would achieve the additional 450W supply?  I may need to do this in the future too, to keep my 2008 Mac Pro running for a few years longer.



  • Thomas Tempelmann Level 1 (45 points)

    Yes, just make sure you buy one with EFI code in its bios.


    Not necessarily: only one card needs to support EFI, because that's only needed for seeing the startup screen, which is shown on one display anyway. Once OS X has booted, EFI support is not necessary any more.


    I, for instance, have an Apple-provided Radeon HD 2600 with EFI support and added a plain non-Apple Nvidia GeForce 9600 in slot 2.  It has no EFI ROM support, yet I get the boot process displayed on the ATI card.  So I just connect one monitor to the ATI and the other(s) to the Nvidia.  I got the card for 15 bucks, dirt cheap.  The same goes for other GPUs: if you look on eBay, those with EFI support for Macs are usually much more expensive, and you don't need those if you leave your original card installed with one monitor connected to it.

  • Grant Bennet-Alder Level 9 (55,350 points)

    There is a 450-watt power supply that is the size and shape of a DVD reader and installs in that cavity inside your Mac. Its power cord can enter through a hole in the backplate of one of the PCI slots (such as the one that is tallest, or that benefits the most). It can be wired to use the presence of power on some other lead as an input, so it turns on and off with the main Mac Pro power.


    The supplied power from any supply is usually a bit above the nominal required voltage. Making the grounds common goes a long way, and anywhere the power is super-critical, you will find the board has its own on-board regulation there as well, so it has generally not been a problem. Also, you could do the reverse of what was suggested for cheater cords: run the entire external power for one card off the external supply, rather than mixing cheater-cord power for one and aux power for the other.

  • Neil Paisnel Level 1 (30 points)

    Yes, that is what I would do: run the second card's external power connectors off the external PSU.


    I did eventually find the unit you mention that fits in the CD slot, on Newegg, but it's showing out of stock.


    I have been running a 2600 and a Radeon 4870 for a year or more now.  A non-EFI 4870 that I guess I reflashed with Zeus to give it EFI support.  I assume I reflashed it, anyway, as it shows the boot sequence.  That was long ago and I just can't remember.

    Just thinking it is time to upgrade again, to try and keep my 08 machine running for a bit longer.




    I don't mind playing with PSUs on scrap hardware, like old Intel PC boxes set up as servers, but doing a similar PSU hack on my 'baby' does make me a bit nervous.  Frying the main board by sending unregulated voltages to it via a dodgy external PSU, graphics card, and the PCI slot... yikes, an expensive mistake to make.

  • Neil Paisnel Level 1 (30 points)



    I have another question regarding the use of a second card.


    I discovered this morning, using the Cinebench benchmark app, that no matter which screen I started the test from, it only ever tested one card (the original ATI 2600), in my case the slower of the two cards installed.

    I had a hunch that starting the app from the screen powered by that graphics card may then force it to use that card... but NO.


    It turns out that in order to get Cinebench to use the second card (a 4870), I had to go to System Preferences | Displays | Arrangement tab and drag the grey menu bar, which allocates the primary screen, to a monitor connected to the more powerful card.


    This got me wondering: how does the Mac allocate which card does the processing for Photoshop, Final Cut, etc.?  Does it really only use the card that is designated as the primary display in System Preferences?  Is this done in-app, or on a system-wide basis?  I'm wondering whether the second card can be any old card, with all the processing handled by the primary no matter what the second card is.



    Previously I had set it up so that my two main FCP X edit screens were running on different cards, the theory being that the load would then be shared between the two GPUs.  Now I am wondering: is it better to have one app running on screens powered by the same GPU?

  • Grant Bennet-Alder Level 9 (55,350 points)

    On the dark-cylinder Mac Pro, all six displays are handled by the ONE card that has display interfaces. Most GPU-intensive computation is handled by the second GPU card, which has no display interfaces. This is done automatically, and you cannot change it.


    Anandtech said (in his article on the new Mac Pro) that this is done because GPUs do not handle context-switching gracefully. Basically, if you start a big array-transform computation and then have to do some other computation to draw or refresh a portion of a screen, you dump the work in progress, refresh the screen, then start over on the array transform.


    From his review of the Mac Pro (late 2013):

    Under OS X the situation is a bit more complicated. There is no system-wide CrossFire X equivalent that will automatically split up rendering tasks across both GPUs. By default, one GPU is setup for display duties while the other is used exclusively for GPU compute workloads. GPUs are notoriously bad at context switching, which can severely limit compute performance if the GPU also has to deal with the rendering workloads associated with display in a modern OS. NVIDIA sought to address a similar problem with their Maximus technology, combining Quadro and Tesla cards into a single system for display and compute.

    Due to the nature of the default GPU division under OS X, all games by default will only use a single GPU. It is up to the game developer to recognize and split rendering across both GPUs, which no one is doing at present. Unfortunately firing up two instances of a 3D workload won’t load balance across the two GPUs by default. I ran Unigine Heaven and Valley benchmarks in parallel, unfortunately both were scheduled on the display GPU leaving the compute GPU completely idle.


    The same is true for professional applications. By default you will see only one GPU used for compute workloads. Just like the gaming example however, applications may be written to spread compute workloads out across both GPUs if they need the horsepower. The latest update to Final Cut Pro (10.1) is one example of an app that has been specifically written to take advantage of both GPUs in compute tasks.




  • Neil Paisnel Level 1 (30 points)

    Ummm, yikes.  So in a system with two cards and four screens, there is no way of really knowing whether the more powerful card is taking the workload or not.  That said, my most intensive application is FCP X 10.1, so hopefully the better card is at least doing some work.

    I have a GeForce 8800 GT 512 MB sitting here spare.  OK, so it's old and slow compared to modern cards, and according to some comparisons I have seen it performs slower than the 4870, but it performs better than the original ATI 2600 HD.


    And then there is a difference depending on the task at hand and the processes needed, i.e. shoot-'em-up game rendering compared to video editing.



    Either way, only one way to know.