For the sake of "science" and our sanity, I'd say try Clockingoff's example and Boot Camp a Windows install (not a virtual machine). If Windows runs fine natively on your Mac's hardware, then we can rule out your card as the problem. All you have to lose is time (which is precious, of course), and you could potentially save $150.
All I can say is I've been suspicious of OS upgrades ever since the days of Windows 95, when upgrades were always so blatantly problematic compared to fresh, non-upgraded installs (I'm assuming you upgraded to Mavericks). I let those suspicions slide with OS X after going through multiple successful upgrades, but Yosemite has rekindled them. There were some weird quirks when I first upgraded to Yosemite: it booted into a black screen the first time after installing, and I thought I had bricked my system, but the good ol' restart-until-it-works trick saved the day. Who knows; it was okay for about a month after that, and maybe that's all I'll get this time, too. But GpuTest's FurMark has been running for several hours now and is still kicking, even with me mashing on Launchpad and Mission Control. I'll report back in a month if I've encountered no problems (and certainly sooner otherwise).
Edit: for anyone else reading this, during the reinstall I chose not to restore from a Time Machine backup, to avoid carrying over any old contamination. Instead I let Yosemite install completely first, then used Migration Assistant to import my files from the TM backup. I'm still skeptical that this actually avoided any contamination, but it seems to be fine so far. We'll see....