
View Full Version : Off the Leading Edge ($--)



Tamaster
10-02-2010, 01:58 PM
I love it when I can get a great price on great hardware that is nearing its End-of-Life while still being quite excellent for the applications I care about: not-so-brand-new gaming, Linux and BSD computing, and Distributed Computing on a budget. Here's my latest:

Do you remember when the Radeon 48xx family hit? What a *_SPLASH_* that made! Almost a Billion transistors @ 55nm, 256 bit-wide memory @ GDDR5 speed, Unified Superscalar Shader architecture X 800 stream processing units resulting in over a TeraFlop of mind-bending performance from a SINGLE card! It still makes me break a sweat thinking about the exceptional restraint that I had to exercise just to keep from running out RIGHT THEN and buying one! But I won that contest...

That was a mighty big Can o' Whup-Ass that AMD/ATI busted out! That Tek changed the way games were written. It even changed the way Operating Systems and other applications are being written today. And it bought AMD the time and revenues that it required to go back to the big drawing board and work on the next leap-frog generation that they are about to drop on the market very soon. We will NOT be disappointed, 'cause they have been listening to their enthusiasts (that's *_US_* folks here!)

Back to my story...

Bang for Buck. I live that... I've wanted to get one of those 800 stream cards for a while. For UNDER $100US. And up until now, that's been the hard part. Between the 4890/70/50, the prices didn't really drop much after the 5xxx series was brought out BECAUSE the market demand was still so strong. But time, the great equalizer, was on my side this time. While I would love to have a '90, the power draw is a bit steep and the '70 is still about $30 more than I want to pay. Which brings us to the '50...

What I really wanted (and didn't get) was a 650MHz 4850 w/256MB GDDR4 clocked @ 2GHz (effective). The GDDR4 has a much better power envelope than either GDDR3 or GDDR5 while still performing well, and the smaller buffer would have made it possible to have the card run off of slot power ONLY (yes, it's possible). Unfortunately, GDDR4 just didn't take off, so the above described card is 'UnObtainium'...

What I'm getting is a 625MHz 4850 w/512MB GDDR3 clocked @ 1.4GHz (effective) for UNDER $100US. I even have a $15 rebate from the manufacturer which brings the price down to (drum roll please)

$79.99 for an 800 stream half-gig 4850

It's going into a dual-core 5600+ based 64-bit Linux cruncher that runs BOINC 24/7/365, that I built up from spare parts (literally in my basement), and that will live out its life doing good things. Because I can...

There are bargains to be had out there people. You just have to watch for them... :icon_wink:

Tamaster
10-02-2010, 10:22 PM
And in case you're wondering why I didn't pay another ten bucks and go with a similarly spec'd 4870 (yeah, go check THAT out), it's because of the power draw. It's an efficiency thing folks. If you are running @ 100% load 24/7/365, the heat and power take their toll. On the card, PS, mobo and everything else in the case. I'm doing this on a budget and I pay the electric bill...

Dirk Broer
10-03-2010, 09:33 PM
Yeah, the electricity bill, I fear that one as well. I have 5 PCs running 24/7/365, including one with a GTX260, and my kWh/year has gone up 40% as compared to last year... I fear I have to look into better-performing CPUs/GPUs per watt pretty soon. On the brighter side: I just bought a HD3870 for just 45 euros, so now to find a PCIe board that goes with it...

Tamaster
10-04-2010, 01:42 AM
One of the things that I've invested in is a Kill-A-Watt EZ power meter (model P4460) which allows me to measure pretty accurately the outlet power line draw of the systems that I have (and more importantly, how that power consumption changes when I do hardware upgrades). You can program in your cost per kilowatt-hour and it can tell you (in real time) how much it costs to operate per hour, day, week, month or year. The system that I do most of my desktop work on once was drawing over 500 watts until I:

1) upgraded the CPU to newer technology chosen for performance per watt
2) upgraded the single video adapter to a newer cross-fire pair chosen for efficiency
3) upgraded the PS to a higher capacity but also much more efficient (Gold) rated unit

I've now got higher performance (roughly a 10% increase) with a 200 watt decrease in power draw (roughly a 40% decrease), but I never would have known it if I hadn't measured before and after.

It's worth buying one if you are a committed cruncher...
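For anyone curious before buying one, the arithmetic the meter does is simple enough to sketch. Here's a minimal Python version of the hour/day/week/month/year cost readout; the 300 W draw and $0.10/kWh rate in the example are placeholder numbers, not measurements from any of the systems above:

```python
# Sketch of the running-cost calculation a power meter like the
# Kill-A-Watt EZ performs: measured draw (watts) plus your electricity
# rate ($/kWh) gives cost per hour, day, week, month, and year.

def running_cost(watts, dollars_per_kwh):
    kwh_per_hour = watts / 1000.0              # 1 kWh = 1000 W for one hour
    per_hour = kwh_per_hour * dollars_per_kwh  # cost of one hour of runtime
    return {
        "hour":  per_hour,
        "day":   per_hour * 24,
        "week":  per_hour * 24 * 7,
        "month": per_hour * 24 * 30,           # 30-day month, like the meter
        "year":  per_hour * 24 * 365,
    }

# Placeholder example: a 300 W box running 24/7 at $0.10/kWh
costs = running_cost(300, 0.10)
print("per day:  $%.2f" % costs["day"])   # 0.3 kWh/h * $0.10 * 24 = $0.72
print("per year: $%.2f" % costs["year"])  # ~$262.80
```

Run it with your own measured wattage and local rate to see what a 24/7/365 cruncher really costs.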

NeoGen
10-04-2010, 07:18 AM
I have seen images of those Kill-A-Watt things... I could really go for one, it seems to be a really nice tool for dedicated crunchers like us to have... and especially more for folks like vaughan, Liuqyn, or Mitchell that have tens of computers running 24x7. :)

mitchellds
10-04-2010, 02:41 PM
I have one of those gadgets, but it just confirms what I already know. It takes a lot of power and bucks to keep my toys running. I wonder how much power I could get off a home-built tidal power generator? I do live on a river that has a very reliable 3 ft tide change.


Ramjet
10-05-2010, 03:45 AM
One of the things that I've invested in is a Kill-A-Watt EZ power meter (model P4460) which allows me to measure pretty accurately the outlet power line draw of the systems that I have (and more importantly, how that power consumption changes when I do hardware upgrades).

Speaking of Kill-A-Watt, if anyone is interested in acquiring one of these, many libraries have them available to check out, and Xoxide has a pretty good deal on them right now. :)
http://www.xoxide.com/p3-kill-a-watt.html

Tamaster
10-05-2010, 10:35 AM
That is an excellent price, BTW. You can easily recover its cost if it helps you identify and upgrade *_JUST ONE_* piece of equipment or appliance. Think of it that way...

NeoGen
10-05-2010, 01:10 PM
I don't have one of those Kill-A-Watt toys yet, but I already know I have several very non-energy-efficient (and old) appliances at home... I'm just waiting for them to break down so that they can be replaced. :icon_razz:

spikey_richie
10-07-2010, 09:44 AM
I have one of these http://www.diykyoto.com/uk - have had it for about 3 years and it's awesome. You can't buy them in the USA though, which is a shame. My AMD X2 6000+ costs £350/annum to run at 100% CPU load.

Dirk Broer
10-07-2010, 05:36 PM
£350/annum? Now I understand why my girlfriend asked me to close down "1 or 2 of those Boinc machines". If my 5 machines are as expensive to run as yours, they should be costing roughly 2275 euro/annum... Next time Boinc comes asking for money from me I will return the favour.

Tamaster
11-25-2010, 03:00 PM
After getting this card rolling in stock configuration, it is able to earn around 50k credits/day from PrimeGrid Sieving with this 2.8 GHz dual-core on 230 watts of power ($0.33/day), from a machine I've put about $250 in parts into.
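The per-day figure above checks out if you back out the implied electricity rate. A quick sanity check in Python; note the ~$0.06/kWh rate is inferred from the numbers in this post, not something stated anywhere in the thread:

```python
# Sanity-check the $0.33/day claim for a 230 W cruncher running 24/7.
watts = 230
kwh_per_day = watts / 1000.0 * 24        # 5.52 kWh consumed every day
implied_rate = 0.33 / kwh_per_day        # electricity rate that makes $0.33/day true

print("kWh/day: %.2f" % kwh_per_day)         # 5.52
print("implied rate: $%.3f/kWh" % implied_rate)  # about $0.060/kWh
```

That works out to roughly six cents per kilowatt-hour, which was a plausible US residential rate at the time, so the $0.33/day number is internally consistent.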

I'd rather be crunchin' computational biology type work, but until more projects get ported to OpenCL (which is supported by *_BOTH_* Nvidia and AMD) this will have to do...

vaughan
11-28-2010, 07:05 AM
Tamaster, I agree we are favoring the Math projects with our GPUs. I have put some of my CPU cores on the Biology / Medical projects to try to redress the balance a bit.

AMDave
11-28-2010, 08:47 AM
/ed -
half an hour later I remember it all now
the proof was that
1) the Bio projects would not achieve the same speed-up as the maths projects
2) GPGPU-capable GPUs do not have the same proliferation as CPUs yet and won't for a long time (if ever)
3) so investing in fine-tuning the CPU apps was going to bring bigger net gains in the short AND medium term
the point was that GPGPU does not bring all things to everyone
there was also some factual and sensible evidence behind the conjecture and it, sadly, made good sense.
- ed/


Actually, in spite of DA's stance on waiting longer before developing GPU clients for SETI, I am very surprised that IBM have not pushed GPU client development forward on WCG.
(Don't forget the December run on WCG is about to start!!!)
Until WCG catch up (and they will) we will be crunching mostly math on the GPU, I think.

vaughan
11-28-2010, 12:36 PM
Until WCG catch up (and they will) we will be crunching mostly math on the GPU, I think.
Don't forget Folding at Home, it's a shame it's not BOINC.

Tamaster
11-28-2010, 01:29 PM
Well... Folding@home is getting the benefit of my crossfire pair of HD2600XT adapters on my main computer to good effect. And I'm running WCG on *_ALL_* my computers, including the one that I just added PrimeGrid GPU crunchin' to. Right now it's mostly Cancer research (which I have a personal stake in), Muscular Dystrophy (which I also have a personal stake in) and Clean Water and Energy (which we *_ALL_* have a stake in). I'm not knocking the math, cosmology or climate research arenas (I've contributed to all in the past, and will again). It's just the lack of common platform support (both sides are at fault in my not so humble opinion) driven by a desire to glean every last bit of performance. Hardware Abstraction Layers cost MIPS/FLOPS. Fact of Life. But the benefits *_FAR_* outweigh the cost in distributed computing (also IMNSHO). Granted, there are various hardware limitations to overcome, which you can work around with software (CPU) emulation to fill in the gaps. No one has done this. Yet. It's on my wish list...

Tamaster
11-05-2011, 09:05 PM
One of the reasons that I've been MIA is that my place suffered a massive electrical surge that took out some electronics.
(and two computers, one of which was running GPU work for PrimeGrid)

I'm still sorting out this mess and it might be a while before I get to the bottom of it.

Dirk Broer
11-05-2011, 09:40 PM
Hi Tamaster,

That is bad news. I hope you can claim the damage from the power company involved.
It won't get you your lost credits back, but something is better than nothing.

NeoGen
11-06-2011, 01:03 AM
Sorry to hear Tamaster

I've been fortunate that over here we only have a power outage 2 or 3 times a year, and it's usually just that: the power goes down but no electronics are affected.
Best of luck sorting it out, and like Dirk said I hope you can claim the damage with the power company.