Please do not PM me with questions! Questions via PM will not be answered. Post on the forums instead!
For service manual, schematic, boardview (board view), datasheet, cad - use our search.
By creating turbulence and incredible noise? The air has to move away as it is pushed. By not restricting it, you get twice the airflow on one half of the supply (the side where the air leaves the PSU). That part may run cooler. Isn't that better than having the same (higher) temperature across the whole PSU?
Less jewellery, more gold for the electrotech industry! Half of all computer problems are caused by bad contacts
Plastic costs money, there is a reason they include it. And it's not "just cause we can".
It creates a difference in air pressure which keeps air pockets from being created so the air actually moves out of the power supply.
You can experiment with some cigarette smoke at low RPM, replacing the lid that holds the fan with a transparent one, so that you can actually see the air trapped there ...
Looks like a path blocker, there to stop the air from going straight to the exhaust grille without cooling any components.
It helps direct the air to where it's supposed to go (the PCB), but it's also an aerodynamic restriction that creates turbulence and consequently noise.
Decent Andyson unit. Not 500W though. Good heatsinks, nice input caps for a non PFC unit, GBU806. I think it's good for 350W, the transformer probably couldn't handle more than that. 30A schottky for all rails. Gonna recap it and put it in a build with a Core 2 Duo and a GTX 260, should be perfect for that. What do you guys think about it?
Andyson units are alright. I've seen 400W Hipros (with a single 80mm fan) well capable of their labeled rating and they only had 35 size transformers (Coolermaster's RS-430-PMSR/P, which is a rebranding of Hipro/Chicony Power's HP-P4507F5W/P). However, for 450W+ PSUs, it's probably best to use a 39 size transformer or better.
The GTX 260 doesn't take more than about 200W from the +12V rail, if I'm not mistaken, so if that Core 2 Duo doesn't draw much more than a Pentium 4 (the lower-end Core 2 Duos didn't, to my recollection), it should be okay.
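As a rough sanity check on that build, here's a quick sketch of the +12V current draw. The wattage figures are assumptions for illustration (~200W for the GTX 260 under load, ~65W TDP for a mainstream Core 2 Duo), not measured values:

```python
# Rough +12V load estimate for the proposed GTX 260 + Core 2 Duo build.
# Both wattage figures below are assumptions, not measurements.
gpu_watts = 200   # GTX 260 under load (approximate)
cpu_watts = 65    # mainstream Core 2 Duo TDP (approximate)

psu_12v_watts = gpu_watts + cpu_watts
amps_12v = psu_12v_watts / 12

print(f"{psu_12v_watts} W on +12V ≈ {amps_12v:.1f} A")  # → 265 W on +12V ≈ 22.1 A
```

At roughly 22A, that only fits a 350W-class unit if most of its label is available on +12V, so the rail sticker is worth checking before committing.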
^
It's before the fuse, so no, it won't blow the PSU fuse, and faulty house breakers/fuses are more common than you might think. Having said that, what would probably happen if it were to short is that the whisker would just melt.
I love putting bad caps and flat batteries in a fire and watching them explode!!
No wonder it doesn't work! You installed the jumper wires backwards
Main PC: Core i7 3770K 3.5GHz, Gigabyte GA-Z77M-D3H-MVP, 8GB Kingston HyperX DDR3 1600, 240GB Intel 335 Series SSD, 750GB WD HDD, Sony Optiarc DVD RW, Palit nVidia GTX660 Ti, CoolerMaster N200 Case, Delta DPS-600MB 600W PSU, Hauppauge TV Tuner, Windows 7 Home Premium
Office PC: HP ProLiant ML150 G3, 2x Xeon E5335 2GHz, 4GB DDR2 RAM, 120GB Intel 530 SSD, 2x 250GB HDD, 2x 450GB 15K SAS HDD in RAID 1, 1x 2TB HDD, nVidia 8400GS, Delta DPS-650BB 650W PSU, Windows 7 Pro
If it shorts out, the power supply fuse pops or the house fuse trips.
Big deal. One power supply dead in 10000, not even a drop in the ocean for that power supply's failure rate.
You're seriously getting boring with your crusade against hard drives and leaded solder.
Man, have you even seen ANY estimates of the incredibly huge damage the RoHS stuff has done to the whole industry and to all customers?
On one hand, since it was pushed through, the lifetime of electronics really has dropped to "warranty plus a bit". Yeah, maybe it used to be that way before as well, but now look at the waste: most laptops, graphics cards, motherboards and pretty much everything. Here, the used-parts market is almost dead because almost all of the more powerful graphics cards die in like 3 years.
Thinking about that, maybe you can't blame display and TV makers for using crap caps. First, because of RoHS, practically everybody across the whole electronics industry is doing it. Second, what guarantee do you have that it won't fail because of the bloody solder anyway, even if you used good caps? We see cracked joints all the time. Now we even see whiskers in real products, not just in pictures from NASA.
On the other hand, manufacturers do not profit THAT MUCH from selling electronics like hot cakes now, because they have invested trillions in replacing leaded solder and it still isn't finished. There simply still isn't ANY type of lead-free solder that can replace it without serious problems.
Another whisker (though not in the picture, and I broke it off) was almost long enough to short earth and neutral. Incorrect wiring could make it blow.
I agree with Behemot. There is NO lead-free solder that works as well as SnPb. Go on mariushm, get that magnifier out and tell me you can't see whiskers in any of your equipment. And give me some nice high-quality photos to prove it.
(On a side note about reliability, I don't care how equipment holds under "typical use" if that means light use, in a cool environment, for a relatively short time. For the record, I do use big HDDs for something, but for most stuff I prefer known reliability. Funnily enough, the one big HDD I have that survived two years has perfect S.M.A.R.T. But I guess I just got lucky with that one. Mind you, it's a Seagate. Did I win the lottery??? )
PS. I don't think you're using proper quotes, Behemot. But I still agree with what you're saying.
I know there are whiskers, I know they're not predictable, there's nothing I can do about it. It's a very small percent of hardware that has whiskers, and an even smaller percent of hardware that dies because of it.
Leaded solder DOES NOT FIX WHISKERS... as long as there's TIN in the solder, you're going to have whiskers. Leaded solder just decreases the growth and the frequency of whiskers popping out.
Behemot : I've read it, I know about it, I did my own research as well.
Yes, like I always say, there is damage caused by RoHS and by leaded solder no longer being used, but you're blowing its importance out of proportion.
There's far more waste caused by planned obsolescence, cheaping out on other parts in general, designing stuff to fail after warranty and so on, that the solder issue is ridiculously small in the scheme of things.
A very small percentage of failures on a product is caused by bad solder joints, and an even smaller percentage by tin whiskers.
And for some products, it's a compromise and you grudgingly have to accept partially that it's planned obsolescence - after all, nobody's going to make a video card with high-quality capacitors and a perfect heatsink and so on when they make new designs every year and have to keep it cheap (just to take your example). They have to sell something new and cheap every year, otherwise they'd sell $3000-5000 video cards every 5-8 years and you wouldn't like that. Actually they wouldn't even sell, because everyone would buy $1500 consoles instead. Compromises.
You have to look at the bigger picture. Concentrate your energy on something else; there's no point in blaming everything on RoHS and solder (and new hard drives) like Shocker seems to do.
Just off the top of my head, I can think of more serious issues than solder, for example dust, fans and humidity... I don't see anyone complaining that they're making TVs now that die when you splash them with a bit of water... water leaks onto the bonding tab and causes hydrolysis, and the copper traces corrode to the point that contact is lost, like in this video:
But heaven forbid you guys get one of these monitors on your bench and spot a tin whisker or cracked joint... that's all you're going to see.
Just the same, I see very few here complaining about the quality of fans in hardware like power supplies.
Everyone seems to understand fans are consumables, that their oil dries out, that it's acceptable to use sleeve bearings, that they need to be replaced after a while, but no... solder must be perfect, permanent, not allowed to have tin whiskers or anything.
Nobody argues that a few dollars worth of dust filters on a computer case could save tons of computer hardware from the trash dump.. but complaining about solder and tin whiskers is fun.
Shit breaks, and it breaks for lots of reasons, just don't ignore all the other reasons to focus strictly on solder. That's all I'm saying.
PS. And you guys should put on your thinking caps and think, maybe the solder issue is also aggravated by other factors like humidity and heat.
Back then when leaded solder was used, what kind of electronics were there? How much current were they using, how hot were the components?
Can you compare a power supply that outputs 100W at best with a power supply of today that needs to output 650W on demand?
Did you have components back then like video cards that can idle at 10W in Windows and suddenly go to 200W in games?
Were those computers vibrating as much as they vibrate now? Did that hardware use 3-5 fans (which vibrate) and several hard drives? No, some had only one fan on the power supply and a tiny passive heatsink on the CPU.
People like Shocker keep saying how good old hardware was, with little consideration for how the environment and demands of current hardware compare to back then.
Different hardware, different failure points and reasons, different accelerators for failures.... anyway, I'll stop here.
marius - I'll give you one simple fact which will render your arguments null:
Even a 3% proportion of lead in tin solder vastly improves its qualities, not just in terms of whiskering but also in terms of fracturing.
Yes, high-tin solder is sometimes necessary for BGAs with fine-pitch leads, but what is the reason for not allowing a tiny bit of lead in the alloy? What is this eco-communist obsession with lead-free alloys?
"We have offered them (the Arabs) a sensible way for so many years. But no, they wanted to fight. Fine! We gave them technology, the latest, the kind even Vietnam didn't have. They had double superiority in tanks and aircraft, triple in artillery, and in air defense and anti-tank weapons they had absolute supremacy. And what? Once again they were beaten. Once again they scrammed [sic]. Once again they screamed for us to come save them. Sadat woke me up in the middle of the night twice over the phone, 'Save me!' He demanded to send Soviet troops, and immediately! No! We are not going to fight for them."
It's a very small percent of hardware that has whiskers, and an even smaller percent of hardware that dies because of it.
At least the first one is TOTALLY wrong.
The HDDs I used in my example in another thread were functional. In any case, I don't think they would likely fail in such a way as to cause rapid whisker growth. Had they been abused, so be it. If the HDD itself doesn't fail, I don't want the solder to fail under the same conditions.
And for the record, I DON'T have 10,000 power supplies. I probably don't even have 100.
There's far more waste caused by planned obsolescence, cheaping out on other parts in general, designing stuff to fail after warranty and so on
I don't disagree with that. But the problem arises when we come to otherwise high-quality stuff.
And for some products, it's a compromise and you grudgingly have to accept partially that it's planned obsolescence - after all, nobody's going to make a video card with high-quality capacitors and a perfect heatsink and so on when they make new designs every year and have to keep it cheap (just to take your example).
I DON'T accept it, because I want a PC that will last the decade with no maintenance besides dusting. If it's enough for what I'm doing, I won't upgrade. End of story.
$3000-5000??? Yeah right, the cost is still much the same if you get a decade instead of a year.
The reason for these tiny steps is because they have to keep up with the competition - nothing else. If there was no competition, they could introduce a groundbreaking part every five years and I couldn't be happier.
You have to look at the bigger picture. Concentrate your energy into something else, there's no point in blaming everything on rohs and solder
You seem to be assuming that we're only thinking about the things we're explicitly mentioning.
We've already mentioned the usual suspects (bad caps, seized fans, overheating) 1,000 times over, so it would be extremely boring to mention them again.
Just the same, I see very few here complaining about the quality of fans in hardware like power supplies.
See above
Oh, by the way - it's okay if your research disagrees with mine, but you can't say that means your research is the correct research. I'll take an in-depth look over distributor statistics any day of the week.
I strongly disagree with the VGA part. There HAVE been video cards with higher consumption, and they often HAD tiny heatsinks with tiny noisy fans. I have seen many of them completely full of dust, preventing almost any cooling. And yet they still work just fine. Those are cards even from the GeForce 6 and 7 era.
Now? You buy a new mainstream card with under 100 W TDP, very comparable with the old cards, put a huge heatsink and fan on it, and it dies after 3 years, 4 at most. Why? Is it planned obsolescence? Oh come on man, people always bought new cards for new features and more power, even with leaded solder, and many of those still work today. As for prices, looking at today's graphics card prices, I don't think they are anywhere near low, do you?
Also, those older cards did throttle later on, but they still didn't lower their voltage in 2D. That means they had high power consumption all the time and were producing heat all the time. To me, that's a more serious problem than heating up and then cooling down again. Anyway, with leaded solder, which has better mechanical properties, there would not be any problem; even with constant heating and cooling, the joints would handle it.
There are people and even whole companies doing reflows and reballs, springing up like mushrooms after rain these last couple of years, repairing anything from video cards to PlayStations. Don't talk about planned obsolescence, that bullshit. It just dies because of the bloody solder.
One thing I tried to point out is that there was ALMOST NO planned obsolescence before RoHS. Is that just a coincidence? Or is it that, even if they tried to design a product to last long, it would fail because of the solder anyway?
I DON'T accept it, because I want a PC that will last the decade with no maintenance besides dusting. If it's enough for what I'm doing, I won't upgrade. End of story.
Well wake up and smell the coffee, the companies that make hardware don't want you as a customer.
$3000-5000??? Yeah right, the cost is still much the same if you get a decade instead of a year.
A fab costs billions of dollars. That's why few companies can afford to have them, and why AMD sold their GlobalFoundries fab and now make their processors at TSMC.
Development of a processor or a GPU chip takes 1-2 years... do the math on how much it costs to pay hundreds of engineers for those 2 years... right now they're developing graphics chips that will appear in 2015.
The actual fabrication of a processor or gpu can take a few months and if the first spin has too many flaws, it takes another quarter of a year to see the results.
To make chips in giant steps means the companies would need to have giant budgets to survive for years and get small profits from whatever they sell.
The reality is that they'd release even more often and with smaller improvements, but it's not possible due to how much it takes to manufacture.
If you want them to release every 5-10 years they'd have to recuperate all the money they spent through those years and bring profit to all their investors who didn't get any dividends for 5-10 years.
The reason for these tiny steps is because they have to keep up with the competition - nothing else. If there was no competition, they could introduce a groundbreaking part every five years and I couldn't be happier.
No, they're compromises to keep products cheaper and competitive.
Each processor/ gpu development is targeting a particular manufacturing process, designing their chips to work with other ICs and so on.
For example, AMD can start designing a processor now, but they have to think about where they'll fabricate it. If they want to use TSMC, they have to ask them what plans they have for the future and so on.
For example, let's say TSMC uses 40nm process to fabricate processors.
AMD may want to move from 40nm process to 28nm process in one shot 5 years from now to keep you happy, but TSMC can only promise they'll move to 32 nm in 2 years and maybe jump to 22 nm in 6 years.
That's because it can cost 2-3 billion dollars to upgrade the fabs to 22nm, but it may only cost a few hundred million to move to 32nm in 2-3 years, then a few hundred million more to go to 28nm in 5 years, then either upgrade or build a new factory for 22nm. With such large jumps, like going from 40nm to 22nm, it's often cheaper (but still in the billions of dollars) to just build a new fabrication plant.
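To make that trade-off concrete, here's a toy sketch using purely illustrative figures loosely based on the post above; none of these are real TSMC numbers:

```python
# Toy comparison of fab upgrade spending, in millions of dollars.
# All figures are illustrative, loosely taken from the discussion above.
incremental_steps = {"40nm -> 32nm": 300, "32nm -> 28nm": 300, "28nm -> 22nm": 2500}
single_jump = {"40nm -> 22nm": 3000}  # roughly the cost of a whole new fab

total_incremental = sum(incremental_steps.values())  # 3100
total_jump = sum(single_jump.values())               # 3000

print(f"incremental total: ${total_incremental}M, spread over years")
print(f"single jump:       ${total_jump}M up front")
```

The totals end up in the same ballpark, but the incremental path spreads the spending out, with each intermediate node earning revenue before the next payment comes due, which is why small steps win in practice.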
Both AMD and nVidia have been screwed by TSMC in the past: they promised a manufacturing process would be ready in 6 months, and it took them 9-11 months to refine it to the point where it was worth fabricating something as complex as processors (the failure rate was too high).
It's just too hard to explain properly, but what you want just can't be done in today's world, with today's expectations of hardware (by most people).
I meant the cost to me in the example you mentioned.
By the way, I'm not too keen on sacrificing reliability to save power.
In the HDD case, I was using both old and new under similar conditions. Also, old hardware may have consumed less power, but it also had less cooling.
Tell me where Seagate admits that the 7200.14 is that much less reliable than the Barracuda ATA IV. Here are all the reliability-related specs from the manual:
Temperatures in C, altitude in feet, shock in G @ 2ms
LD = limited displacement
Vibration in inches = displacement
Vibration for 'cuda IV is specified as zero-to-peak
Everyone else agrees with me that whiskers are a common problem, so stop arguing. You have a right to your opinion but you can't say our opinion is wrong without solid proof.
EDIT: And I never said whiskers were the only problem with lead-free solder.
Last edited by Shocker; 01-11-2013, 12:22 AM.
Reason: forgot humidity