
Own hardware vs. cloud computing

Posted: Wed Apr 08, 2020 6:38 pm
by iceman1992
If I want to contribute as much compute as possible for some given amount of funds,
which is more cost effective: buying hardware and running it myself, or just buying cloud GPU compute time?
What cloud GPU providers are popular, and most cost effective?

Re: Own hardware vs. cloud computing

Posted: Thu Apr 09, 2020 10:00 am
by PantherX
I believe that AWS, Google Compute Engine, and Microsoft Azure are not cost-effective at all (from a personal perspective).

I remember reading somewhere that you can rent GPU "mining" boxes but, instead of mining, fold on them at a reasonable price. I can't remember what the site is, but hopefully that person will see this topic and post it :)

Re: Own hardware vs. cloud computing

Posted: Thu Apr 09, 2020 10:13 am
by iceman1992
Ah, I found one such platform: vast.ai
Can someone explain how they can be so affordable? I'm seeing a system with a Xeon E5-2620 v3 and 4x RTX 2080 Ti, rentable for $0.722/hour.
How on earth is that possible? It's 75.7 TFLOPS of GPU compute; I can get over 10 million FAH points for just ~$20.

And for anyone who has tried folding on vast.ai, would you mind giving a short guide on how to set it up?
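For what it's worth, a rough sanity check of that estimate. The per-card PPD figure below is an assumption for illustration, not a benchmark:

```python
# Back-of-envelope check: points earned from ~$20 on a 4x RTX 2080 Ti rig.
RATE_USD_PER_HOUR = 0.722     # quoted vast.ai price for the whole machine
BUDGET_USD = 20.0
PPD_PER_2080TI = 2_000_000    # ASSUMED points-per-day for one RTX 2080 Ti
NUM_GPUS = 4

hours = BUDGET_USD / RATE_USD_PER_HOUR   # ~27.7 hours of rental time
total_ppd = PPD_PER_2080TI * NUM_GPUS    # ~8M PPD for the whole rig
points = total_ppd * hours / 24          # PPD is per day, so scale by hours/24

print(f"{hours:.1f} h of rental -> ~{points / 1e6:.1f}M points")
```

That lands in the same ballpark as the 10M-points-for-$20 figure, so the math checks out if a 2080 Ti really does around 2M PPD.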

Re: Own hardware vs. cloud computing

Posted: Thu Apr 09, 2020 10:31 am
by v00d00
It would be interesting to see a case study of what could be run from an 8 kW solar system as part of a household as well. If the power were generated for free and you maximised production by using low-wattage cards, like those mining 1060s @ 75 W, perhaps in a low-power 12-24 V setup using a PicoPSU, how viable would that be as a long-term, 'fire and forget' folding solution? Connect them directly to the battery bank rather than via the inverter.

Re: Own hardware vs. cloud computing

Posted: Thu Apr 09, 2020 5:38 pm
by Endgame124
v00d00 wrote: It would be interesting to see a case study of what could be run from an 8 kW solar system as part of a household as well. If the power were generated for free and you maximised production by using low-wattage cards, like those mining 1060s @ 75 W, perhaps in a low-power 12-24 V setup using a PicoPSU, how viable would that be as a long-term, 'fire and forget' folding solution? Connect them directly to the battery bank rather than via the inverter.

I have a 9 kW solar system on my home, and on average I produce a 6 kWh/day surplus while also covering all my home usage, folding with a 1080 Ti, and running Rosetta at home on 4 older systems (Q9650, A10-5800K, A10-7870K, i3-370M).

I have an EVGA 1660 Super on the way to experiment with the best PPD/watt; depending on what I find, I may step up to a 2060 Super.
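Using the figures in this thread, here's a quick estimate of how many 75 W cards a 6 kWh/day surplus could sustain around the clock. This ignores battery and conversion losses, which would eat into the number:

```python
SURPLUS_KWH_PER_DAY = 6.0   # reported average daily surplus
CARD_WATTS = 75             # low-wattage mining-style GTX 1060

# One card running 24/7 draws 75 W * 24 h = 1.8 kWh per day.
kwh_per_card_per_day = CARD_WATTS * 24 / 1000
cards = SURPLUS_KWH_PER_DAY / kwh_per_card_per_day

print(f"~{cards:.1f} cards sustained 24/7")
```

So on paper, roughly three such cards could fold continuously on that surplus alone.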

Re: Own hardware vs. cloud computing

Posted: Thu Apr 09, 2020 7:20 pm
by iceman1992
How much did your solar system cost?

Re: Own hardware vs. cloud computing

Posted: Thu Apr 09, 2020 9:00 pm
by Endgame124
About $20k after incentives, which means the payments are similar to my average power bill.

Re: Own hardware vs. cloud computing

Posted: Thu Apr 09, 2020 10:18 pm
by Jorgeminator
iceman1992 wrote: How on earth is that possible? It's 75.7 TFLOPS of GPU compute; I can get over 10 million FAH points for just ~$20.
You can make a lot more than 10M points with $20 if you go for the interruptible instances. I made 2M points with just $0.97 out of the free $1 trial. :D

Re: Own hardware vs. cloud computing

Posted: Fri Apr 10, 2020 5:01 am
by iceman1992
Jorgeminator wrote:
iceman1992 wrote: How on earth is that possible? It's 75.7 TFLOPS of GPU compute; I can get over 10 million FAH points for just ~$20.
You can make a lot more than 10M points with $20 if you go for the interruptible instances. I made 2M points with just $0.97 out of the free $1 trial. :D
I haven't really looked into it; how do those work? Is it better to go for interruptible instances? And how do I set up F@H on one?

Re: Own hardware vs. cloud computing

Posted: Fri Apr 10, 2020 8:54 am
by Jorgeminator
I don't know the technology behind it, but you rent an instance and SSH into it, then install F@H as you would on any Linux machine.
An interruptible instance remains up for bid by other people while you're renting it, which means you can be outbid, and your instance will then be paused until the higher bidder has finished using it. For example, I rented two instances, an RTX 2080 Ti and a 2x GTX 1080 Ti, for around 5 hours at about $0.10/h each. They were never outbid in that time.
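Putting the numbers reported in this thread side by side (PPD varies a lot by work unit and hardware, so treat these as rough ratios rather than guarantees):

```python
# Interruptible rental described above: two instances for ~5 hours.
interruptible_cost = 2 * 0.10 * 5                # $1.00 total

# Points-per-dollar, using the figures reported earlier in the thread.
interruptible_pts_per_usd = 2_000_000 / 0.97     # 2M points for $0.97
on_demand_pts_per_usd = 10_000_000 / 20.0        # ~10M points for ~$20

print(f"interruptible: ~{interruptible_pts_per_usd / 1e6:.2f}M pts/$, "
      f"on-demand: ~{on_demand_pts_per_usd / 1e3:.0f}k pts/$")
```

By these reports, interruptible instances come out roughly 4x more cost-effective than the on-demand estimate, at the price of possible pauses when you're outbid.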

Re: Own hardware vs. cloud computing

Posted: Fri Apr 10, 2020 10:05 am
by iceman1992
That simple? I read somewhere that you need to set up Docker on it?

Re: Own hardware vs. cloud computing

Posted: Fri Apr 10, 2020 10:12 am
by Jorgeminator
It's that simple. I used the nvidia/opencl:devel-ubuntu18.04 image for the instances.

Re: Own hardware vs. cloud computing

Posted: Fri Apr 10, 2020 10:59 am
by iceman1992
So you didn't use Docker? Okay, I guess I'll try it out after they sort out the server overload. No use renting machines if they'll just sit idle.

Re: Own hardware vs. cloud computing

Posted: Fri Apr 10, 2020 11:06 am
by PantherX
I would be keen for anyone to document the exact instructions to get this running once GPU WUs are reliably served again :)

Re: Own hardware vs. cloud computing

Posted: Fri Apr 10, 2020 11:53 am
by Neil-B
Given my kit is strictly CPU, I would be interested in a "Fool's Guide" (being one of said individuals) so I could give it a try … Thanks in advance :)