Remember when I said that I wouldn’t delid a G80 core?
Yeah…
Me neither. So, I delidded it. Multiple ones, to be exact, and I have only one thing to say: the risk is completely not worth it.
Also, I got a GeForce2 Ti, or at least a cheaper variant of this GPU, “VX”. I honestly didn’t know that it existed, but You live and You learn.
Case 1: 8800 GTX numero uno
OK, I kinda lied that I don’t remember about delidding this GPU. It was only the previous post, after all. I was too curious not to try. Sure enough, using a slightly modified technique from this tutorial, I successfully managed to delid a dead core. With that done, I picked the worst-looking one as a ‘test GPU’.
I don’t remember doing anything to the core, so the abundance of flux behind the metal cover confused me a bit at first. It must have been the dead VRAM that got the previous owner to “reflow” the core with a heat gun or something. You can’t really see any of it in this picture, but it will reveal itself once the IHS is gone.
Let me tell You one thing: if You ever did a delid on a PS3 and found it stressful, then I dare You to do this. It’s not particularly difficult, but the epoxy used here is beyond hard, and the thought that one wrong move kills the GPU still lingers in my mind.
I fortunately managed NOT to destroy any capacitors, and I didn’t damage the substrate at all. You can also really see the yellowish flux residue now. Notice the pattern of the old thermal paste here: it’s good in the middle, but the corners had little contact with the IHS.
90nm cores naturally ran hot, which usually changes the silicon color to a plum-like violet, and that seems to be the case here. But that’s not really important. What is important is carefully removing the edges of the epoxy, so that the IHS sits lower and, hopefully, mitigates the effects of uneven surfaces while increasing the mounting pressure. If You ever do this yourself, remember to be extremely careful – the epoxy is VERY hard and You will likely need a metal knife to grind it. It’s extremely easy to cut traces on the PCB this way, but I don’t know a better method.
To lower the IHS even more, I decided to lap its edges. This really shows how unevenly these were manufactured. Also, I did this on a Sunday, which means I couldn’t buy any lower-grit paper, and the only ones I had were 1200 and 1500. IT TOOK ME AGES TO GET TO THIS POINT.
Anyway, when I test-fitted it back together, the IHS was rocking a bit, which meant I had to remove more epoxy. After doing that I couldn’t notice any more movement, so I put it back together and went for a test drive.
At first, success – it was working.
Not for long, though. My heart sank after it started artifacting.
Of course. Why wouldn’t more VRAM fail?
A close-up inspection revealed that the TIM had spread out correctly. “Oh, it even spread out correctly,” as I said to myself.
Quick BGA rework later I was back in Windows.
“After replacing all that crusty thermal paste temps surely are better”
NO.
They are the same. I kept the test going and got literally the same results as with the non-delidded one.
This procedure is a giant risk for basically no return. And don’t even get me started on wasted time.
Case 2: 8800 GTX numero dos
“Last one might have been a fluke.” – that’s what I thought when I was disassembling the second one.
This GPU is only a month older than the previous one, but it’s in a much better condition, both visually and ‘spiritually’.
I had to choose a different corner, because, as You have already seen, these IHSes were REALLY not precise. What’s more, the knife was way easier to insert here compared to the previous one.
OH S**T.
I don’t really know what’s worse: the fact that I knocked a capacitor off, or the lack of epoxy.
I’ll start with the epoxy, or silicone in this case. It’s more confusing than bad, but I have never seen silicone used here. Was this a fab choice and I got lucky, or has someone already been here before me?
As for delid difficulty, silicone is worse than epoxy: You have to cut almost every edge before the IHS comes off, whereas epoxy just pops off once enough of the knife is in.
Back to the really bad thing – missing capacitor. I measured it and it’s filtering VRAM voltage. Very likely not necessary, but I wanted it back.
After some struggling (I MUST BUY A MICROSCOPE!) I managed to solder it back onto the substrate.
It’s shifted a bit, but my not-so-steady hands back then still managed to fix my mistake.
Look at the TIM pattern; it’s the same as on the previous one – good coverage, but there wasn’t enough pressure on the corners, which turned them plum-like violet. Interestingly, the center of the core is still silverish.
Fortunately I didn’t have to lap the IHS this time, so I decided to put it back together, but with a twist this time.
And here goes the fair comparison – I’ve got multiple variables now, but to be honest, I don’t care. I want the best temps, at least by semi-reasonable means.
So far so good. It’s working just fine.
Any noticeable differences?
Yes.
No.
Not really?
Temperatures are lower and the fan (a custom curve was used for all tests) doesn’t go to 100%, but there is a giant BUT.
Ambient temperature was lower. Lower by a lot. I mean almost 15°C.
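A fairer way to compare runs recorded at different room temperatures is to look at the delta over ambient instead of the absolute reading. Here’s a minimal sketch of that idea – all the numbers in it are made-up placeholders, not my actual measurements:

```python
# Compare GPU load temperatures recorded at different ambient temps.
# Every number below is a hypothetical placeholder, not a measured value.

def delta_over_ambient(gpu_temp_c: float, ambient_c: float) -> float:
    """Return how far the GPU heats up above room temperature."""
    return gpu_temp_c - ambient_c

# Run 1: stock mount, warm room
run1 = delta_over_ambient(gpu_temp_c=85.0, ambient_c=30.0)  # 55.0 °C over ambient
# Run 2: delidded + lowered IHS, cool room
run2 = delta_over_ambient(gpu_temp_c=72.0, ambient_c=15.0)  # 57.0 °C over ambient

# A lower absolute reading can still mean a *worse* delta:
print(f"run1 delta: {run1:.1f} °C, run2 delta: {run2:.1f} °C")
```

With a 15°C ambient swing, a card can read 13°C cooler and still be transferring heat slightly worse than before – which is exactly why I don’t trust my own numbers here.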
In conclusion – don’t do it. It’s not worth it. Just water cool it.
Case 3: GeForce2 Ti, or is it??
I got this card for almost free; it would be a sin to pass up an occasion like that. Let’s restore it.
Normal GPU, right?
NOPE. What even is a “Ti VX”? And why does it have a “GTS” marking?
After a bit of googling I still don’t know why it says GTS, but I do know that it’s a lower-clocked variant of the GF2 Ti – by 25MHz. That doesn’t seem like much by today’s standards, until You read that the normal Ti was clocked at 250MHz, which means 10% of the core clock is gone.
I will try and “fix” that mistake later.
Pretty satisfying, isn’t it?
OK, so I tested it in 3DMark99, and this is the score I got. Now for the fun part – I managed to clock the core up to ~310MHz, which, according to AnandTech, is a meh result at best. There is one problem: VRAM. I could only clock it to 440MHz, which is basically trash.
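To put those clocks in perspective, here’s a quick back-of-the-envelope calculation. I’m assuming the GeForce2’s published specs – 4 pixel pipelines and a 128-bit DDR memory bus – and treating the memory clocks as effective (DDR) rates, which may not match how your tool of choice reports them:

```python
# Back-of-the-envelope GeForce2 throughput numbers.
# Assumes 4 pixel pipelines and a 128-bit bus (published GF2 specs);
# memory clocks are treated as effective (DDR) rates.

def fillrate_mpix(core_mhz: float, pipelines: int = 4) -> float:
    """Theoretical pixel fillrate in Mpixels/s."""
    return core_mhz * pipelines

def mem_bandwidth_gbs(effective_mhz: float, bus_bits: int = 128) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return effective_mhz * 1e6 * (bus_bits // 8) / 1e9

print(fillrate_mpix(225))       # Ti VX stock core
print(fillrate_mpix(310))       # overclocked core
print(mem_bandwidth_gbs(400))   # stock DDR-400: 6.4 GB/s
print(mem_bandwidth_gbs(440))   # my OC: ~7 GB/s
```

The core OC raises theoretical fillrate by almost 40%, but the memory only gained 10% – and cards of this era were bandwidth-starved, which lines up with the benchmarks barely moving.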
With that in mind I decided not to do a BIOS swap, as the core OC gave me almost nothing in terms of performance.
What I did do, though, was compare it to my GF2 Pro.
Now, I could theoretically order -QC40 VRAM (technically it’s normal DDR, but here it acts as VRAM) to replace the installed -QC50 chips, but Chinese suppliers aren’t really known for their fair practices, and the ordered ICs would very likely be repainted slower chips.
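If You’re wondering what those suffixes mean: assuming the common DRAM marking convention where “-QC50” / “-QC40” encode the cycle time in tenths of a nanosecond (5.0ns and 4.0ns – an assumption on my part, check the datasheet for your exact chips), the conversion to a rated clock is just the reciprocal:

```python
# Convert a DRAM speed-grade suffix to its rated clock.
# Assumes "-QC50" / "-QC40" encode cycle time in tenths of a nanosecond
# (5.0 ns and 4.0 ns) -- a common convention, but verify per datasheet.

def rated_mhz(cycle_ns: float) -> float:
    """Base clock in MHz for a given cycle time."""
    return 1000.0 / cycle_ns

for grade, ns in (("-QC50", 5.0), ("-QC40", 4.0)):
    base = rated_mhz(ns)
    print(f"{grade}: {base:.0f} MHz base, {2 * base:.0f} MT/s DDR")
```

So the installed -QC50 chips are rated for 400MT/s effective – my 440MHz overclock was already past spec, which explains a lot.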
Thanks for reading!