
Cloud Mining vs. GPU Mining: Which One Is Better? A Guide You Can't Miss

Cloud mining vs GPU mining, which one is better? It's difficult to give a definitive answer. Both new and used graphics cards are hard to find or overpriced, partly because cryptocurrency enthusiasts are snapping up every model that can be used for mining, i.e., confirming transactions on coins secured by a proof-of-work consensus algorithm.
When considering cloud mining vs GPU mining, we should point out that if you want to do some mining yourself, choosing the right hardware for the task is not easy, and not just because of GPU scarcity. Because of the way crypto mining works, the blockchain itself restricts what you can use: the best mining GPU needs enough VRAM, and gaming performance is not necessarily a good indicator of mining performance. Power consumption is also a major factor, because it directly affects how much profit you can make.
Before looking at the most suitable graphics cards for mining, note that although cryptocurrency prices have recently soared to record levels, their volatility makes it hard to predict what will happen next. The current run may simply repeat what happened in 2011, 2014, and 2017, and prices could well collapse again. Cloud mining vs GPU mining, which one is better? Cryptocurrency enthusiasts are used to the roller coaster, but you may not be, so it's best to temper your expectations.
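Since mining revenue scales roughly linearly with the coin's price, a quick sensitivity check shows how sharply a price drop stretches the payback period. The revenue and cost figures below are illustrative assumptions loosely based on the RTX 3090 numbers later in this guide, not predictions:

```python
# Illustrative only: daily revenue scales ~linearly with coin price,
# so a price drop stretches the payback period proportionally, and
# can eliminate break-even entirely once revenue falls below the
# electricity bill. All figures are assumed, not live market data.

base_daily_revenue = 7.90   # USD/day at a reference coin price (assumed)
daily_power_cost = 0.75     # USD/day in electricity (assumed)
card_cost = 2000            # USD, inflated street price (assumed)

for price_factor in (1.0, 0.75, 0.5):
    profit = base_daily_revenue * price_factor - daily_power_cost
    if profit > 0:
        print(f"price x{price_factor}: break even in {card_cost / profit:.0f} days")
    else:
        print(f"price x{price_factor}: mining at a loss")
```

Even a 25% price drop pushes the payback period out by more than a hundred days under these assumptions, which is why the break-even estimates in this guide should be read as best-case figures.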
We also know that not everyone is happy about graphics cards being used to mine cryptocurrency, but the situation is unlikely to change much until at least 2022, when Ethereum completes its multi-stage transition from proof of work to proof of stake.
You can also look at it from another angle: in a best-case scenario, you may recover part or all of the graphics card's cost, or even mine with it when you are not gaming, earning a small profit on the side.

Cloud mining vs GPU mining, which one is better? Top performance: NVIDIA RTX 3090
If you are looking for the best hash rate per graphics card, NVIDIA's Ampere architecture is a good choice. The RTX 3090 in particular happens to excel at both gaming and mining.
Mining with the Ethereum algorithm, an RTX 3090 can produce about 120-125 MH/s at an average power draw of 285 watts, with some overclocking and voltage tuning factored in.
Given that the RTX 3090's already high price is further inflated by GPU scarcity and scalpers, and assuming local electricity rates of US$0.10 to US$0.12 per kWh and an Ethereum price that does not fall below US$1,900, breaking even takes roughly 237 to 278 days.
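The break-even arithmetic behind estimates like these can be sketched as follows. The `revenue_per_mh_day` figure is an assumption standing in for the daily Ethereum mining revenue per MH/s, which fluctuates with coin price and network difficulty:

```python
# Sketch of the break-even math used throughout this guide.
# revenue_per_mh_day is an assumed USD figure for daily mining
# revenue per MH/s; in practice it moves with coin price and
# network difficulty, so treat the result as a rough estimate.

def break_even_days(card_cost_usd, hashrate_mh, power_watts,
                    revenue_per_mh_day=0.065, electricity_per_kwh=0.11):
    """Days of 24/7 mining needed to recoup the card's price."""
    daily_revenue = hashrate_mh * revenue_per_mh_day
    daily_power_cost = power_watts / 1000 * 24 * electricity_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    if daily_profit <= 0:
        raise ValueError("mining at a loss: no break-even point")
    return card_cost_usd / daily_profit

# RTX 3090 example: ~122 MH/s at 285 W, inflated street price assumed at $2,000
print(round(break_even_days(2000, 122, 285)))
```

With these assumed inputs the result lands near the upper end of the 237-278 day range quoted above; a lower purchase price or higher coin price shortens it accordingly.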
Time spent gaming on it will also eat into your potential profits, but this card will run most games in 4K, with the exception of Cyberpunk 2077, where even the magic of DLSS cannot keep it at a consistent 60 frames per second.

High performance: RTX 3080 and Radeon VII
Nvidia's RTX 3080 comes close to the RTX 3090, with a hash rate of 95-100 MH/s. As before, this requires proper overclocking, voltage tuning, and cooling. The typical power draw for this hash rate range is 220-250 watts.
Cloud mining vs GPU mining, which one is better? As with the RTX 3090, price is the biggest drawback, because you can hardly find one at its US$699 list price. Mining on it breaks even in 216 to 267 days, but once that point is reached, it turns a profit quickly.
Some of you may not take the Radeon VII seriously, and understandably so: it was AMD's second attempt to make the Vega architecture shine, and it ultimately failed to deliver on its promise of competing with Nvidia's RTX 2080. But thanks to AMD's use of HBM2 memory, it is an excellent performer for mining Ethereum.
Careful tuning can produce 95-100 MH/s at a power draw of 190-200 watts. Nevertheless, the Radeon VII may be hard to find.

Moderate performance: RX 6000 series, RX 5700 XT, RTX 3070, RTX 3060 Ti, RTX 3060, RTX 2080 Ti, RX Vega 64/56
AMD's Big Navi does not disappoint in gaming performance, as you can see in our reviews of the Radeon RX 6800 and RX 6800 XT (the RX 6900 XT is just a slightly enhanced version of the latter). For mining, however, its performance is somewhat underwhelming. AMD has confirmed it has no plans to weaken the mining capabilities of its graphics cards, unlike NVIDIA, which plans to limit the mining performance of some RTX 3000 series chips.
The most you can get out of these cards is 60-65 MH/s, at a power draw of 160-190 watts. If you can buy the RX 6800, it is the clear winner of the three in price and energy efficiency, so it should produce the best mining results. Cloud mining vs GPU mining, which one is better? Assuming you can get one, the break-even time is about 180 to 236 days.
For those who have an RX 5700 or RX 5700 XT, these two cards can reach 50-56 MH/s at power levels of 120-155 watts. In terms of hash rate per watt they are certainly among the best GPUs, and they are also excellent performers in 1440p gaming.
The same is true of the Radeon Vega 64 and Vega 56: properly tuned, they can reach 45-52 MH/s, depending on how far you can overclock the HBM. Power consumption varies widely from card to card, since it depends largely on the silicon lottery, but they usually draw 130-165 watts. Buying these cards second-hand may cost more than they did at launch, but they can break even within six months, which makes them less risky given the volatility of crypto prices.
Nvidia's RTX 3070 and 3060 Ti are very close in both gaming and mining performance, with the latter costing less. Properly tuned, they can reach about 60 MH/s while consuming 120-130 watts. Cloud mining vs GPU mining, which one is better? Due to high pricing, however, the break-even time ranges from 225 to 242 days. These are the first choice for gaming PC upgrades, so their availability may suffer the most.
The RTX 2080 Ti was not a great value when it launched, and that has not changed. It has always been a capable 4K gaming card, and it also manages a mining hash rate of 55-60 MH/s at a power draw of 155-180 watts. Buying one currently costs an arm and a leg, about $1,300 on e-commerce sites, but after 180-190 days of mining you can break even.
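The cards in this tier are easiest to compare on hash rate per watt. A quick sketch using the midpoints of the ranges quoted above (real results vary with tuning and the silicon lottery, so these are rough illustrative figures, not benchmarks):

```python
# Efficiency comparison (MH/s per watt) using midpoints of the
# hash rate and power ranges quoted above; illustrative only.

cards = {
    "RX 5700 XT":  (53.0, 137.5),   # (MH/s, watts)
    "Vega 64/56":  (48.5, 147.5),
    "RX 6800":     (62.5, 175.0),
    "RTX 3060 Ti": (60.0, 125.0),
    "RTX 2080 Ti": (57.5, 167.5),
}

# Rank cards from most to least efficient
for name, (mh, watts) in sorted(cards.items(),
                                key=lambda kv: kv[1][0] / kv[1][1],
                                reverse=True):
    print(f"{name:12s} {mh / watts:.3f} MH/s per watt")
```

By these midpoints the RTX 3060 Ti and RX 5700 XT lead the tier, which matches the hash-per-watt praise above; purchase price and availability still decide which one actually breaks even first.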

Honorable mentions: RTX 2070, RTX 2060 Super, GTX 1080 Ti, GTX 1060 6GB, RX 580/570/480 8GB
Looking at the latest Steam hardware survey, the GeForce GTX 1060 still powers many gaming rigs. If you happen to have the 6GB version, you can achieve a modest but still respectable hash rate of 25 MH/s at a low power draw of 90-100 watts. The same goes for AMD's Radeon RX 580, RX 570, and RX 480 graphics cards, provided, of course, we are talking about the 8GB variants.
Even at inflated prices, these graphics cards let you break even within 4 to 5 months. Cloud mining vs GPU mining, which one is better? Nvidia's GTX 1660 Super is also worth considering: it can achieve a hash rate of 26-32 MH/s at a power draw of 70-90 watts.
Even better, AMD's Radeon RX 5600 XT can reach 42 MH/s while drawing 105-115 watts. These are great 1080p gaming cards. Cloud mining vs GPU mining, which one is better? Assuming you can find them for around US$300-350, they might pay for themselves after 5 months of mining.
The RTX 2070 Super, RTX 2070, RTX 2060 Super, and GTX 1080 Ti all achieve about the same hash rate of 40-43 MH/s. However, the 1080 Ti draws close to 200 watts while the other cards draw about 110-130 watts, a sizable difference. They are priced at more than $700 on e-commerce sites but can handle 1440p gaming, and it takes about 6 months of mining to recover the cost.