I have no sympathy for miners, but purely from a software freedom perspective this is maximally evil: closed-source software that decides what you can use your hardware for.
The CMP version is a straight-to-landfill product. As soon as it stops being profitable for mining it will have zero resale value, and become e-waste.
> The CMP version is a straight-to-landfill product. As soon as it stops being profitable for mining it will have zero resale value, and become e-waste.
Like literally every other piece of Bitcoin or crypto mining technology. It’s straight to the landfill every 6 months.
Bitcoin generates 100 GRAMS of e-waste for every transaction. Every 2 transactions consume as much energy as driving a Tesla from SF to NY [edit for clarity: round trip] and generate as much e-waste as throwing your phone out the window along the way.
Don’t hate the player; hate the game. Nvidia is maximizing shareholder value as per their charter, creating differentiated product offerings for different segments: Quadro, RTX, CMP. It’s all segmentation and, better yet, likely binning. I wonder if CMP products are just failed RTX and Quadro parts they’re offloading. For every card Nvidia doesn’t sell, Ant will sell 2 ASIC miners.
Time to push back on proof of waste, and end the economic incentive.
I checked your numbers against [0]. Road distance NY -> SF = 2900 miles [1]. Model S mileage is 290Wh/mi [2]. That is about 840kWh for the Tesla against 657kWh for a single transaction. This is all order-of-magnitude stuff but... wow.
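If anyone else wants to re-run the ballpark, the arithmetic fits in a few lines of Python (all three inputs are just the figures cited above, so this is order-of-magnitude only):

    tesla_wh_per_mile = 290      # Model S figure cited above [2]
    sf_to_ny_miles = 2900        # road distance cited above [1]
    btc_kwh_per_tx = 657         # per-transaction estimate cited above [0]

    trip_kwh = tesla_wh_per_mile * sf_to_ny_miles / 1000
    print(trip_kwh, btc_kwh_per_tx, trip_kwh / btc_kwh_per_tx)
    # -> 841.0, 657, ~1.28: one transaction is in the same ballpark as the drive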
Yeah, the distance thing was ballparked, based on the last time I ran the numbers. To your point, I accidentally omitted the “round trip from SF to NY” part - I also used the great circle distance (2586mi) and ballparked a phone at 200g, roughly the weight of an iPhone.
Thank you for going through the numbers and adding the sources. I was on my phone and meant to circle back.
Can someone explain how transactions are not insanely cost prohibitive for Bitcoin with power consumption figures like those? Those numbers would suggest something like >20 USD. Is it because most transactions in practice actually happen internally on exchanges that avoid putting every individual transaction on the blockchain?
I have limited knowledge of Bitcoin so maybe not quite "explain like I'm five"... but close. :)
The whole point of Bitcoin mining is to produce a block, which contains a bunch of transactions. The Bitcoin network dictates that each block's SHA256 hash starts with a certain number of zeroes, so the only way to achieve this is to brute force the block data until you find the "winning" block whose hash has the required number of leading zeroes.
Once someone finds a winning block, they're rewarded with a number of Bitcoin. This subsidizes the cost that goes into mining that block. However, the reward halves every 210k blocks, so as the reward goes down, miners will prefer to only include transactions with high fees. Eventually the true cost of mining will be reflected in the transaction fees.
And to add to the "it costs 657.6kWh to process a transaction": energy is used to produce a block, which contains an arbitrarily defined number of transactions. Right now, Bitcoin Core limits each block to 1MB, which works out to about 2k transactions per block. If Bitcoin Core were to increase the limit to say 10MB, the energy used to produce a block doesn't change, but the energy used to process a transaction goes down tenfold.
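To make the "brute force" part concrete, here's a toy sketch (simplified: real Bitcoin hashes an 80-byte header and compares the result against a numeric target rather than literally counting zero characters, and the header/merkle construction is hand-waved here):

    import hashlib

    def mine(block_data: bytes, difficulty_zeros: int) -> int:
        """Brute-force nonces until the double-SHA256 hash has enough leading zeros."""
        nonce = 0
        while True:
            header = block_data + nonce.to_bytes(8, "little")
            digest = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()
            if digest.startswith("0" * difficulty_zeros):
                return nonce  # the "winning" nonce
            nonce += 1

    # The work depends only on the difficulty, not on how many transactions are
    # packed into block_data -- which is why per-transaction energy is simply
    # (energy per block) / (transactions per block): ten times more transactions
    # per block means one tenth the energy per transaction.
    print(mine(b"a batch of transactions...", 4))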
> And to add to the "it costs 657.6kWh to process a transaction": energy is used to produce a block, which contains an arbitrarily defined number of transactions. Right now, Bitcoin Core limits each block to 1MB, which works out to about 2k transactions per block. If Bitcoin Core were to increase the limit to say 10MB, the energy used to produce a block doesn't change, but the energy used to process a transaction goes down tenfold.
Oh, it could be somewhat improved. Then it'd be "only" 65 kilowatt hours, compared to ~1 watt-hour for conventional payment networks.
(And, this assumes that the increase in bitcoin price doesn't cause more mining).
>Then it'd be "only" 65 kilowatt hours, compared to ~1 watt-hour for conventional payment networks.
Sure, and? You get other benefits; those might not be relevant for you, in which case keep using conventional payment networks. But if things like counterparty risk of your payment processor come into your calculations, even $10 a transaction might be a good deal.
The entire point of this subthread you're responding to is that the cost of transactions is so energy intensive that those "other benefits" are null and void.
It's called an externality. It doesn't matter if you'd pay $10 for it if the cost imposed is too great. There are a lot of things people would absolutely pay for that we don't allow, because of that. Dumb weird fantasies about decentralization or counterparty risk or whatever don't actually matter if the energy cost is that high. It's just not important enough. Sorry.
> It's called an externality. It doesn't matter if you'd pay $10 for it if the cost imposed is too great
The cost imposed is proportional to the fee, since the fee has to pay for the electricity used to mine the transaction. And $10 of electricity is just... not a lot to worry about.
If you believe electricity consumption in general has huge externalities, you have bigger problems than mining, and once you address those the mining cost/difficulty will adjust.
> The cost imposed is proportional to the fee, since the fee has to pay for the electricity used to mine the transaction.
No. Miners get rewards for mining blocks, which pay for 80%. Transaction fees don't pay for the electricity used in a transaction: not even close.
> If you believe the electricity consumption in general has huge externalities
Well, sure, I believe that electricity is artificially cheap. I also believe that spending $50+ of electricity per transaction done is absolutely nuts. And that amount is steadily increasing...
Hm, OK, I was wrong about the ratio of fees to reward. Good news is that reward is going to halve every 4 years so in 12 years it's probably going to be much more even.
Sorry, but even if we significantly improve things such that a transaction "only" takes the equivalent of a few days of my electricity use.. that's not so great. There's better ways to do this that don't require mining.
Indeed, from a decentralization perspective, mining is looking worse than proof of stake, etc.
That makes a lot more sense, given that many transactions are included within a block. So effective transaction fees are likely in the few-cents-USD range.
I've read that one issue with the latency of Bitcoin transactions is that many of the large miners from China have network issues. Would raising the block size to 10-100MB cause significant latency issues for transactions even if one could fit a lot more transactions into a single block?
> Would raising the block size to 10-100MB cause significant latency issues for transactions even if one could fit a lot more transactions into a single block?
There are two ways to understand your question so I'll just answer both of them:
Fitting more transactions in a block leads to faster confirmation, and potentially lower fees. When a transaction gets into a block, the network confirms that the transaction is valid and can't be reversed. Right now each block can only fit ~2k transactions, and since a block can only be generated every 10 minutes, the mempool (backlog of unconfirmed transactions) is rather huge. This site shows how big it is: https://jochen-hoenicke.de/queue/. Miners are incentivized to only select transactions with high fees for the next block, which in turn incentivizes users to pay higher fees to get their transactions confirmed faster. With bigger blocks, more transactions get confirmed quicker, which leads to lower fees.
Bigger blocks however mean miners with slow internet connections are at a disadvantage. When a miner finds a winning block, they broadcast the block to as many nodes in the network as possible. After a while, a majority of nodes accepts that winning block, and the miner is eligible for the mining reward. If A finds the winning block 10 seconds after B, but A is able to propagate their block quicker than B, then it's possible that A's block is accepted and not B's. In reality, it's possible that B might still win the race, but the rule is: faster propagation => more chance of being accepted.
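A toy model of the fee incentive described above: the block has a hard size cap, so a miner just sorts the mempool by fee rate and takes the best-paying transactions until the block is full (real node policy is more involved, e.g. it considers packages of unconfirmed parents and children, but the intuition holds):

    def fill_block(mempool, max_block_bytes=1_000_000):
        """Greedily pick the highest fee-per-byte transactions that fit."""
        block, used = [], 0
        for tx in sorted(mempool, key=lambda t: t["fee"] / t["size"], reverse=True):
            if used + tx["size"] <= max_block_bytes:
                block.append(tx)
                used += tx["size"]
        return block

    mempool = [
        {"id": "cheap", "size": 400, "fee": 500},
        {"id": "generous", "size": 400, "fee": 40_000},
        {"id": "average", "size": 400, "fee": 5_000},
    ]
    print([tx["id"] for tx in fill_block(mempool, max_block_bytes=800)])
    # -> ['generous', 'average']; the low-fee transaction waits for a later block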
The cost is socialized across the block reward. So long as there’s more new money coming in than block reward paying electric bills on the way out, the cost of a transaction is socialized efficiently.
Elon’s $1.5B investment only lasted a total of 4 weeks. It’s already gone. It’s in the hands of Chinese coal producers now.
That is roughly what transaction costs are. In some camps this prohibits the technology from ever becoming a widespread method of payment. If we’re going for decentralised currency and 10 exchanges control all the low-fee movement stuff...
«Can someone explain how transactions are not insanely cost prohibitive»
Because the comments above yours are misleading. Transactions don't consume mining energy. Miners expend the same amount of energy regardless if they are validating 1 or 1000 transactions in a block.
That’s some high-class mental gymnastics though. Because if you divide the energy spent across the number of transactions you land back where you started. And if the energy was burned for 0 transaction capacity then the underlying would be worthless. Secure but worthless. So it doesn’t take much inference to realize that the value is in transactability.
You’ll have to explain to me why when mining a block of transactions, it doesn’t make sense to break that down on a per transaction basis with division.
If the number of transactions in a block ever changes I’ll change my divisor. Until then the burden of proof is on you, isn’t it?
«it doesn’t make sense to break that down on a per transaction basis with division»
Because the division result is meaningless as it just reflects how full a block was, not the cost of a transaction. Again: miners expend the same amount of energy regardless if they are validating 1 or 1000 transactions in a block. Exactly the same amount. And whatever math you might be doing doesn't account for batched transactions (1 tx, N outputs), Lightning transactions, or off-chain transactions (think tx within a platform like an exchange).
> Again: miners expend the same amount of energy regardless if they are validating 1 or 1000 transactions in a block.
For one that’s horrifyingly wasteful - you say it like it’s a good thing but it’s really not. And two, I’m describing the network as it exists today. That’s not dishonest; there’s no plan to increase the quantity of transactions per block. If it ever changes we can run the numbers again. But saying it’s a bad way to run the numbers is like saying there’s no cap on the number of bitcoins because the core team could just change the cap. Ok, and if they do, we’ll run the math again.
«For one that’s horrifyingly wasteful - you say it like it’s a good thing but it’s really not. »
It is a great thing: it means Bitcoin can scale without increasing energy consumption. See, that's my problem with the way you phrased your post. You misrepresent the system. You imply transactions consume energy when they don't.
> It is a great thing: it means Bitcoin can scale without increasing energy consumption. See, that's my problem with the way you phrased your post. You misrepresent the system. You imply transactions consume energy when they don't.
You pretend they don’t, but they obviously do! You’re greenwashing one of the biggest environmental tragedies since CFCs.
It could in theory scale past where it’s at, but the core team won’t let it, making your argument no more useful than my unlimited-coins argument. In what way exactly is it different? Yes, if we pretend it’s efficient, it is. But only in our imaginations.
Even if they did increase the block size, fundamental inefficiencies mean that even at maximum scaling it can’t hold a candle to Visa, or even a Raspberry Pi running MySQL.
Not really. It means Bitcoin's energy use _currently_ scales with market cap rather than with transactions. Considering that bitcoin seems to be advertised primarily as a store of value and that there is little effort to push adoption for transactions, that is a very bad thing: it means that, in case of success, bitcoin's per-transaction performance will get worse.
In the long term, if the transaction count doesn't rise fast, we will also see significant issues, either with transaction fees going to absurd amounts, effectively ruling out btc as a payment medium, or with the miner pool shrinking to counts where decentralization becomes nonexistent.
> Miners expend the same amount of energy regardless if they are validating 1 or 1000 transactions in a block.
This ignores the transaction fees. Even though they are only 10-20% of a miner's total revenue, 10-20% of a round-trip from SF to NYC is still a lot of energy.
So essentially it only gets worse from these numbers: right now every block is full, so every transaction is consuming the least energy it can. If you process fewer transactions, the per-transaction energy cost just keeps going up.
Also, the defense of "the energy is spent securing the blockchain" misses the point. The blockchain isn't a good unto itself; it's only useful as a log of transactions, you only need new blocks when there are transactions, and if there are no transactions Bitcoin is basically dead.
kWh, if we're going to be pedantic, which I approve of. :)
It bugs me that even quality newspapers frequently mistake power (watt (W)) for energy (joules (J) or watt-hour (Wh, often with the usual metric prefixes)).
Fuck that. I can hate the game, and the players who decide to play it.
There's zero chance of ending the economic incentives of cryptocurrency mining without (global) government intervention. Instead, the best effort the US government has provided is increasing SUPPORT for it.
Apple took a unique, aggressive, and still-controversial stance for privacy. This position has easily cost them a double-digit percent of their revenue; the amount of money they could generate from selling out their billion+ users, who are generally above-average income, in the same vein as Google does, is unfathomable. I don't like Apple as a company, but I still feel it's important to give them recognition for how industry-leading they were, and still are, in pushing for privacy.
There are ethics which can transcend corporate interest, even when considering negative PR. There are some things companies just don't do. The excuse of "optimizing shareholder profit" and "hate the game" is bullshit; it moves the onus of responsibility off of companies who are fully capable of acting more ethically by blaming some Invisible Hand as if they had no choice in the matter.
That's bullshit. Jensen Huang had a choice. He made it. End of story. You want to shift some blame onto Ant and the miners themselves; that's all fair game for Round 2. But at this time, we're still on Round 1.
It would be reasonable for Nvidia to stand by and do nothing. Graphics cards are, after all, general-purpose computing hardware. Even after they've outlived a useful life in a mining farm, those miners may attempt to extract an ROI by reselling them in secondary markets. Not ideal, but certainly better than Nvidia's plan for the future.
Nvidia's plan is to, instead, support miners directly, while intentionally tiering consumer hardware, then positioning the entire move as somehow "Standing By Gamers". A dystopian science-fiction author legitimately could not write a more canonically evil storyline. I would change this position if Nvidia walks back, cancels the CMP line, and continues forward with driver limitations on the RTX 3060 plus public plans to move to in-silicon ethereum mining protection on future cards; I think that's reasonable, given it's fair to assume that the "ship has already sailed" on Ampere. Until then, fuck Nvidia; I'm not touching anything their hands are involved in.
Given the energy requirements and how it's kind of morphed into this beast of greed, I'm inclined to agree....but how to pull this off?
Fiat bans cannot really work well against just doing math. Taxation schemes/financial regs tamping down on bitcoin speculation, what you can spend it on? By design I don't think you can just pump money into bitcoin in order to "deflate" its value without similarly consuming a lot of energy?
Perhaps a simple ban on converting fiat to bitcoin would go a long way? It wouldn't be absolute, black markets are a thing, but there are costs to participating in a black market, costs which will rise over time as good old physical cash declines.
How would you execute an exchange ban politically? Any politicians who did this would be responsible for instantly vaporizing tons of value owned by an increasingly non negligible fraction of the population that spans a broad spectrum of political affiliation.
An exchange ban also wouldn't stop bitcoin, because the network would still continue running, and anyone cut off and stuck with bitcoin could still use it as a medium of exchange, and it could kick off its development as a direct unit of account. An exchange ban might even make using bitcoin as currency easier, because if it's not legally exchangeable for fiat, how do you tax capital gains on it?
Stopping bitcoin is a REALLY hard problem. It would require a massive coordinated crack down by every government on the planet. And then you're left with a prisoner's dilemma. What happens if China or Russia publicly go along with such a ban, but secretly keep supporting it and buy up coins for themselves, preparing for a day when it will be the global reserve currency that dethrones the dollar? It's a similar game theory problem to why we will likely never live in a world without nuclear weapons.
The reason this will probably never happen is because every government will think through this game theory, and realize the risk of being on the wrong side of a potentially ineffective ban and hamstringing your nation's citizens and companies from participating and developing on the new global financial system, is greater than the cost of just capitulating and adapting to live with it.
There are more people who already own some bitcoin than you'd probably want to believe. Many of them believe bitcoin is on balance a good thing.
There are at least a couple of bitcoiners currently in Congress, including the new Senator of Wyoming who is publicly all aboard the idea that bitcoin is an important, long term store of value that the US needs to jump on board with.
How many politicians do you think already hold some bitcoin quietly, just in case?
This comment is in clear violation of the site's rules. Please don't be deliberately rude to others here and try to interpret comments in good faith rather than as some sort of veiled attack.
>> It is disgraceful to watch people such as Elon Musk and most of the VCs put money ahead of the environment.
> I am sorry you did not buy it when it was cheaper. No need to hate it now because you were stupid back then.
You're obviously defensively making some false assumptions. The big one is probably assuming that people are primarily motivated by greed and jealousy, and that other motivations are fake and meant to disguise that.
And for the record, I agree with the GP, and I've made quite a bit of money off of this bubble by selling some bitcoins I mined a decade ago.
Isn't that a trend that has mostly ended or is about to end now? It looks like that was mostly caused by nm process improvements for ASIC chips in the past. With 7/5nm ASIC chips it seems like it should now progress as fast/slowly as the nm process capabilities of chip manufacturers in general.
Not pro-BTC, but this seems very much like faulty extrapolation based on a past trend.
Technology continues to advance - I don’t think there is enough evidence to say that we have hit a fundamental limit on computer/asic power/speed performance yet.
3nm will hit in 2024, but until then there will always be advances in chip design. It’s not just about speed, it’s also about thermals, power consumption and manufacture price to work out the TCO of mining.
> 3nm will hit in 2024, but until then there will always be advances in chip design.
I'm not an expert on chip design, but I would have assumed that there isn't a lot of room for improvements of ASIC mining chips, given that they are probably a straightforward implementation of the hashing algo used in Bitcoin (and that a few thousand+ times), in contrast to CPU design which is more dependent on other computing/hardware trends.
Well even without chip design, for instance if you decrease unit cost, that also has the impact of changing the mining landscape because many more ASIC miners can enter the game.
Also remember an ASIC miner will generally use more in electricity each year than it costs to buy, so a relatively small improvement in hashes/watt makes a big difference to overall profitability.
CPU design is more complex, but also has more people working in the space and has been going for longer. I just get sceptical if someone says this is as good as it is going to get because history shows usually those predictions end up being wrong. As a child I thought Super Mario 64 was as good as computer graphics were ever going to get...
That doesn't diminish the hashing power of the equipment you already bought though (and the topic was obsolete equipment).
> so a relatively small improvement in hashes / watt makes a big difference to overall profitability
Yeah, that's a fair point.
> I just get sceptical if someone says this is as good as it is going to get
And I'm skeptical if people blindly extrapolate past trends, assuming that they will continue to hold. I wasn't saying that it won't get better from here on out, just that the extreme growth that was mostly fueled by quickly going through the existing nm process steps can't be sustained, and that from now on it will follow the same slowish pace as all chip miniaturization.
> That doesn't diminish the hashing power of the equipment you already bought though (and the topic was obsolete equipment).
It means even more equipment to go obsolete when improvements come through - and more equipment deployed does not equal more bitcoins generated, so it is an issue of more hardware that will ultimately end up in landfill / being scrapped.
> And I'm skeptical if people blindly extrapolate past trends
Well I'm also sceptical of people who blindly assume progress won't be made. To quote someone in 2014 predicting what would happen to ASIC design...
> As an emerging field of IC design, bitcoin mining ASICs have experienced rapid evolution over the past two years. However, they cannot keep evolving and developing at the current rate.
> Like literally every other piece of Bitcoin or crypto mining technology.
I don't get this comparison because Nvidia isn't a Bitcoin mining technology company. Or at least they weren't until a week ago. When someone deems it not profitable to mine on a set of 50 GPUs they purchased, they can resell those to gamers. But these mining cards are just e-waste.
Of course they do. Just because a game isn't being played doesn't mean it doesn't exist. Or at least, the rules do.
In a world of 7 billion humans, all you need is 2 to be the first movers and start playing. If the game is rewarding to the players, then it will continue to attract more players and grow until the size of the network of players itself becomes a huge part of the value of playing.
Games are just a protocol. And if the protocol is useful and better than other protocols, or has some first/second mover advantage, then the network using the protocol will grow until it saturates the addressable nodes.
Lots of professional mining operations will actually undervolt the cards to get them to run more efficiently, decreasing electricity and cooling costs overall. Also there's less thermal stress due to them running constantly, so you don't get the physical stresses of fluctuating between hot/cold cycles.
Assuming the crypto bubble pops catastrophically and never recovers, yes we'll be left with a lot of worthless ASICs and GPUs, but also a ton of newly freed energy sources and chip manufacturing facilities, the development of which will have been funded or subsidized by the crypto speculative craze.
A world with more access to energy and cheaper silicon chips is probably necessary to address the immense challenges civilization is facing.
I have the exact opposite experience - I already got a second card from friends who were using them for mining, once they were no longer good enough for them. Good price, and both are still working just fine for gaming. :)
Just give miners a 10 year tax break for using clean energy and they'll build a whole new green powergrid. At no cost to you, taxpayers, or anyone else. But no! Let's dispense with any rationality and just BAN TECHNOLOGY.
It's rather strange to see so many luddites on HN advocating to ban technology they dislike.
First, they came for the miners. You'll be cancelled someday too.
On the other hand if you were a person who needed to make a transaction in person because you can’t use banks for that (I.e. mafia) you would take that ride and probably spend even more energy commuting several people (arms, team of people) for a sufficiently large transaction.
In the Wild West you would need maybe several horses each in order to exchange a certain amount of gold. What would the carbon usage be here?
Bitcoin solves that problem in software, which is magnificent. For the same carbon footprint.
The link directly contradicts this. Pooled mining with BTC (ETH too, but also BTC) on a GPU is something you can do and make a modest amount of money right now, to the tune of around $5/day at current prices.
There's a profitability floor on hashes/kWh. When a coin's price goes as sky-high as it has, GPU mining gets pushed back into the black. This is likely a temporary state of affairs; either the network difficulty will increase to where only ASICs can break through, or the price will decline.
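Roughly how that floor works out, as a sketch with made-up but plausible numbers (the hashrate, pool payout, wattage, and electricity price below are all assumptions, plug in your own):

    # Toy GPU mining profitability check; every number below is an assumption.
    hashrate_mh_s = 60                 # MH/s on the algorithm being mined
    payout_usd_per_mh_per_day = 0.10   # what the pool currently pays per MH/s per day
    card_watts = 130                   # power draw at an undervolted setting
    electricity_usd_per_kwh = 0.12

    revenue_per_day = hashrate_mh_s * payout_usd_per_mh_per_day
    power_cost_per_day = card_watts / 1000 * 24 * electricity_usd_per_kwh
    print(f"daily profit: ${revenue_per_day - power_cost_per_day:.2f}")
    # ~$5.63/day with these inputs; if the coin price halves, or difficulty rises,
    # the same card can fall below the floor and the profit goes negative.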
The link shows paying in btc because they don’t give you what you mine. You rent your hash rate and people bid on it with btc. You’re not mining btc directly. I use NiceHash at the moment. Your gpu mines ethereum (or whatever the most profitable coin is as NiceHash has a profit switcher.)
Could you explain to me why someone would offer their hashing power on NiceHash vs. just directly working for a pool?
The main cases I imagine are:
1. The miner doesn't want to deal with price fluctuations and wants to prevent arbitrage risk
2. It might be more profitable to outsource smallish coins that have to buy hashing power to stabilize their network (though that would probably also be covered by the profit switcher)
3. Money laundering <--- Which I assume is the biggest portion
Weirdly I can somewhat answer this as I’ve just got into NiceHash since this Friday night, mainly as a learning experience more than a money making endeavour. I have a load of my own company PCs with GPUs like 1080s and 2070s we used to rent for VR usage at events that have basically been sitting in boxes, idle for the last 12 months due to COVID. No idea how I ended up down the mining rabbit hole on a Friday night, maybe a comment here...
Anyway, I now have six machines set up in my (previously... freezing cold) garage, generating around £20 to £25 a day total, after electricity costs. I’m on a metered power supply that is much cheaper at nighttime.
The benefits of NiceHash are:
- super simple set up. Literally an installer, it benchmarks the machine and gets to work in minutes. They have their own OS as well. That’s a rabbit hole for a few evenings this week to have some fun with.
- mobile app for monitoring all your rigs - temperatures, profitability, etc.
- pretty decent web front end for tracking progress, crypto prices, rig stats, etc.
- automatic switching between the most profitable algorithms in real time. If the market is paying money for a specific currency/algo, it will swap to where the money can be made, quickly and automatically, without you needing to be involved.
- easy to disable specific algorithms that don’t work well. First day it kept jumping to KawPow, but for me this was really poor for profitability on my hardware, so it’s literally a click to disable that algorithm in their software.
I guess it just works, is largely set and forget, but offers lots of tweaking, monitoring for those with the desire to do that. Is it the most efficient or profitable? I’m starting to think not, but I’m still at the bottom of the curve for all of this, however, NiceHash has provided the perfect springboard into a new (to me) world.
BTW, I am fairly anti crypto in general. I’m kind of doing this as a learning experience while I have some free hours to burn each week on learning about something I feel I should know a lot more about while having a bit of fun in the process.
Thanks for explaining! I would have assumed that there are some cross-coin mining pools out there that have a similarly good UX, where you would end up with the mined coins directly, but it kind of makes sense that it exists on that abstraction level.
There's other mining software and native OSes out there, like minerstat, that you can separately link to NiceHash or other pools, that abstract it all a level further, swapping between pools based on profitability and probably offering a better payout over a year, but I've not started diving into that yet. Minerstat charges 1.65eu per worker per month after the first rig, but seems to offer a lot more options/flexibility (more stats, more pools, etc.).
I honestly don't understand who's buying the hash power on the other side. I think people start with Nicehash because it's PnP and easy to use. From there you can look at Minerstat which mines directly on pools and offers features like triggers and fancy graphs.
Setting up a miner yourself doesn't take much work but it won't have profit switching or extra cloud features that Nicehash or Minerstat have.
I am sick and tired of this bullshit logic about how much energy per transaction Bitcoin uses. Bitcoin's block size is being forcibly kept low. It can easily be increased to 100 MB, if not more, without causing any serious issues. That would increase the number of transactions per second to around a thousand. Plus, comparing a Bitcoin transaction to a traditional transaction is like comparing apples and oranges. A Bitcoin transaction is completely different as it is irreversible. That energy consumption is the price we pay to make sure our transactions remain irreversible.
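For what it's worth, the throughput arithmetic behind that claim, assuming roughly 250 bytes per average transaction and 10-minute blocks (both rough assumptions):

    block_bytes = 100 * 1_000_000   # hypothetical 100 MB blocks
    avg_tx_bytes = 250              # rough average transaction size (assumption)
    block_interval_s = 600          # one block every ~10 minutes
    print(block_bytes / avg_tx_bytes / block_interval_s)
    # -> ~667 tx/s; smaller average transactions push it toward the 1,000 figure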
Imagine the PlayStation was able to be used to mine crypto. Game developers depend on gamers to actually play their games, what would you have Sony do? Encourage it or do nothing? Pretty sure their obligations to any shareholders would be to limit it so more gamers can play and buy games for it.
Not trying to pick on your comment, but I'm always frustrated when someone points out "obligations to shareholders". I wish people would push back on this more. Sony's obligation when building and selling the PlayStation is to the end-user customer, not the shareholder. Because if Sony has only the shareholder in mind, then a gaming console is probably the wrong market to be in from a pure profit perspective.
Sony's obligation is to the gamer who wants a competitive, performant console. If they put out a next-gen console that has no games to play, nobody will buy it and they'll end up with no profits. Organizations have it backwards if they think they should be catering to their shareholders. Shareholders' expectations should be that Sony has expertise in the field of consoles, and I'm investing in Sony with the understanding that I want to invest in that.
When I buy Sony stock, I don't think Sony should put bringing me a few more points of profit ahead of doing the right thing and building the best product for the end user. I hope that my investing in them allows them to better continue to innovate and bring compelling products to the market.
Milton Friedman's doctrine was a failed turning point in business thinking [0] and unfortunately pervasive in people's perspective today. Prominent shareholders often have different motives than that of delighting the customer. Motives that are very short term profit driven. Such a toxic way to purport value.
Thanks. There should be such a comment every time someone mentions “but they have to maximise profit” and “think of the shareholder”. This point of view is not rooted in any law or legal principle and is just the Wall Street version of “might makes right”.
Yeah, I think people are growing tired of this attitude. If you want to be a part of society and be a "person" in the eyes of the law, you have to be a good citizen as well. Doctors have to follow a credo in addition to making $$, as do lawyers, engineers, etc. Thus it's time that if companies want to be legal "persons" they need to accept some responsibility for the long-term viability of their place in society beyond "shareholder value". I really hope that is a philosophy that more people take to heart.
Of course it is rooted in law, natural law. Imagine a company makes a decent profit and then decides its goal is not to make so much profit anymore? What do you think would happen? I would vouch for Parkinson's Law kicking in instantly. This would just end up in a waste of resources and misery overall.
I think I see what you mean, but natural law is something else entirely. Under that framework, maximising profit for the few at the expense of the rest of society is unjust.
We could use a revival of these ideas (which were the foundation of the enlightenment) in our era of blasé cynicism.
Then, if I understood, your point goes back to “might makes right”, as in, we have the power, we might as well take the money, and to hell with long term prospects or society. It is unfortunate that these people have so much influence over the rest of us. It is also a demonstration that optimising private interests does not result in public good, that is, the invisible hand of the market is utter bollocks.
I read this "Motives that are very short term profit driven." everywhere, but I've never come across an explanation of why that would be the case (something above the anecdotal level). In my understanding DCF is just the natural way to put it. Maybe the discounting rate can depend a bit on the shareholder, but by and large the investment horizon of most players should be quite far out, given the current "risk-free" return.
"Running on more than 1,700 PS3s that were connected by five miles of wire, the Condor Cluster was huge, dwarfing Khanna’s project, and it used to process images from surveillance drones. During its heyday, it was the 35th fastest supercomputer in the world."*
Back at PS2/PS3 they were used for computing, Sony actually released a kit and had paid staff helping people who wanted to build PS2 supercomputers -- a good friend from college worked on the project at Sony. They had amazing vector processing units in them.
Sure, but this still only consumed a tiny fraction of the Playstations in production at the time, rather than ~all of them. There are only so many HPC labs in the world (and an HPC lab using PS3s is a font of good press coverage for Sony!) while there are as many crypto-miners as the market will bear (and they generate only negative press coverage on top.)
They are obligated to shareholders. But I am not a shareholder. I am a gamer who wants to buy a new graphics card. So I am going to scream, boycott and lobby government against them until the needs of their shareholders bend a little closer to those of their consumers. Free speech is a necessary part of a free market.
Maybe Nvidia should be held responsible for their products. Anyone else up to support a law imposing disposal obligations on single-use products designed to kill secondary markets? When the boom slows, the miners can give these cards back to Nvidia for environmentally friendly recycling.
> So I am going to scream, boycott and lobby government against them until the needs of their shareholders bend a little closer to those of their consumers
This line of thinking scares me. It’s basically saying “they aren’t doing what I want so let’s use the government to make them do what I want!” How is this freedom? Forcing people to bend to your will though government is the opposite of freedom, and should be used very sparingly
Unlike certain politicians, I do not consider publicly traded corporations to be "people". I don't really care about their feelings, or even their rights. Corporations are a legal fiction created for a specific purpose, not persons with emotions.
> I do not consider publicly traded corporations to be "people". I don't really care about their feelings, or even their rights.
Whatever helps you sleep at night I guess.
The fact is that corporations are made up of many people, so at a basic level you're still compelling people who want to do what _they_ want, to do what _you_ want, through government.
But it’s pretty clear that you and I are never going to agree on this. So probably not worth discussing too much further.
Why is Free Speech necessary to a Free Market? I thought a consumer choosing to spend their dollars was the "speech" in a Free Market environment. And when you lobby the government, you've kind of thrown out the Free Market paradigm.
Free speech is also product review, such as LTT's recent video on this subject.
Regulation of a market by government doesn't mean abandoning free markets. Regulation is an essential part of sustaining a free market. Totally non-regulated markets quickly devolve into monopoly and other anti-consumer evils such as devices, like this, designed to kill secondary markets.
> Regulation of a market by government doesn't mean abandoning free markets.
"Regulation" in the old sense of "making regular", perhaps. Establishing fair weights and measures, preventing fraud, etc. Ensuring that individuals are free to trade and that all transactions are in fact voluntary (including, most importantly, transactions with the government itself). But when your "regulation" consists of applying force to micromanage how people—shareholders in this case—employ their capital for the benefit of some other group, you've moved well outside the realm of the free market.
>Pretty sure their obligations to any shareholders would be to limit it so more gamers can play and buy games for it.
It's not really the same because playstations are sold at break-even or at a loss with the expectation that follow-up revenue would make up for it. On the other hand no such dynamic exists for nvidia. For them, sale to a miner is the same as a sale to gamer.
This may look true from the perspective of a single sale, but in the long term this just doesn't hold up. NVIDIA builds a strong and long-held reputation with gamers, and are a well renowned name in the field. If they made cards for miners at the detriment of gamers, they risk losing that valuable market position.
This is a common sentiment I see on discussion forums, but I think it's wishful thinking at best. Most consumers aren't enthusiasts that follow these types of news, and even the ones who do probably won't care/remember a few years down the line. After all, nvidia seems to be doing just fine despite their long history of anti-consumer practices over the years.
I think they run the risk of people giving up on pc gaming completely or never getting into it in the first place. A market that has and should be growing will shrink instead.
Which anti-consumer practices? That is, given that we're talking about gamers.
There's Nvidia's longstanding allergy to open source, but that only affects the tiny fraction of gamers who use Linux.
There's this hash rate thing, but from a gamer's perspective that's pro-consumer.
I suppose you can count their price increases for recent GPU generations. But I don't know if that rises to the level of "anti-consumer". Especially when they've been delivering performance to match.
The best example I can think of is their policy of requiring servers to use their more-expensive 'professional' line of GPUs. This does hurt gamers, since it forces game streaming services using Nvidia cards to charge higher prices. That said, among the major streaming services, Stadia, xCloud, and PlayStation Now all use AMD GPUs, while GeForce Now can skirt the policy since it belongs to Nvidia itself. The remaining services are relatively obscure – though perhaps they'd be less obscure if Nvidia didn't have that policy.
Would be true if there were healthy competition, and if a significant percentage of potential buyers knew about it and thus avoided NVIDIA. However, in reality most customers won't know, and they'll still get the GPU; it's just expensive, which they most likely see as a result of the pandemic.
Mostly software subscriptions, actually.. things like PS Plus, etc. On consoles, you need to pay money (subscription) to be able to play multi-player for instance (whereas on PC it's free), and games in general are more expensive compared to PC (last time I checked).
To my knowledge they didn't shut it down; the PS4 architecture moved to a standard x86 PC, so PlayStation no longer offered advanced vector units that you couldn't get separately in an easier-to-work-with way. This led to the rise of the current Top 500, which is a mix of CPU and GPU systems.
I would have had Sony and friends not create a closed ecosystem in the first place.
You're replying in a thread about (compared to consoles) general purpose computers, which developers depend on gamers actually playing their games on, being effectively DRMed.
I don't particularly care about shareholder obligations in this case. I'm glad this incentive structure exists, and it doesn't necessarily inform my ethos.
This is being received well by people who are practically triggered by the notion of cryptocurrency operations.
At least some consoles are sold at a loss to keep the entry cost within an acceptable range, with the expectation of recouping that from game sales. One PlayStation generation came with a somewhat crippled OtherOS feature. By the time Sony killed the feature, people had already built quite a few PlayStation-powered compute clusters.
Yeah, imagine if this applied to trucks. No one could use one for non truck bed usage. Ford disables ignition if it detects the bed is empty after X concurrent uses or miles.
Trucks are often wasted (from a green perspective) because people like them for reasons other than their intended design utility.
I think it’s important to note that the 3060 is not for sale yet. Nvidia is telling everyone up front that this is how it’s going to work.
The analogy is improved like this: Ford announces in advance that their upcoming truck requires a certain usage, and if you want whatever special commercial usage, you need to pay a premium. You might be grumpy about the upcharge, but this is a very different situation from them changing the terms of the deal after the sale.
Yeah, I thought about commercial usage. And they do generally need specific license types. E.g. I can't drive a dump truck. But then I started thinking of the analogous Tesla V100 or something, which is higher priced for commercial applications. Wouldn't it be fair to say Nvidia makes the 3060-3090 for gamers and not for commercial rendering farms? So why is using a gaming GPU for AutoCAD really any different than using it for mining?
Implicit somewhere in the economics and supply is that Nvidia has made these and knows people use them for work. What would our collective thinking be if they made the GPU not work for non-games?
This would probably increase supply further for the gamers. Commercial users should and would pay more since they are literally profiting from their use.
The key difference is a manufacturer exerting control over what you're allowed to do with property that is supposedly "yours". That the terms were announced ahead of time will mean little once this becomes commonplace.
We often have used GPUs in around the price range you're looking for. They won't be fantastic, but probably an upgrade for a 9-year-old machine.
Edit: I should also add, a lot of the really small scale shops also don't list 100% of their inventory online, with the benefit that it doesn't get scalped up like everything else.
We must have different definitions of "essential". I'm curious which jobs you're referring to, because to my understanding, every aspect of society that I consider essential, has existed since before the advent of GPUs.
> every aspect of society that I consider essential, has existed since before the advent of GPUs
Modern computers have allowed these 'forever existing' technologies to scale to our population growth. Try building freeways without modern computers, or performing modern medical interventions without them.
CAD doesn't really use GPU power, only a basic viewport that any integrated potato can render. Related simulation stuff (CFD, FEM) can be done with GPU compute but it seems that CPU is usually good enough.
Probably the most critical usage of GPUs is scientific simulations for biology/medicine, climate, space, etc.
I appreciate this move to make GPUs more available for non-crypto users. As someone who wants to buy an AMD GPU to run my Ryzen machine, I have been waiting 2 weeks because there is no supply of GPUs at the RRP price. I can only buy the 2 year old models for double the price they sold for.
We have a global chip shortage. NVidia isn't magically pulling more GPUs out of their asses. Yes as they claim some chips might be defective in a way that makes them unsuitable for a graphics card but still work as a mining chip, but if you believe that they won't redirect perfectly fine chips into mining card production if demand permits it, just because they care so much about gamers and not their shareholders, I got bad news for you.
Mining demand is volatile and will use whatever gives them the best hashing/$ ratio - pissing miners off might lose you some cash short term, but gamers are much more susceptible to branding, not to mention devs optimise for popular hardware - if AMD suddenly became a major market-share holder in the gaming market because NV shipped their silicon to miners, once mining dips down they get slammed. And if they are hoping production will ramp up to match demand some time in the future, they still want to have a winning product.
How long until GPGPU isn't cost effective for mining? I know a lot of FPGAs went on eBay for cheap when the new Xilinx line came out. Watts per hash have to be lower for FPGAs already. Will the cost of a coin always reflect the current price to make a new one, and thus GPGPU will always be a cost-effective technique?
I think right now biggest earner for GPU mining is Ethereum. And (I think) it's aiming to move to Proof of Stake sometime in next 12-24 months, which kinda time limits the utility of both Eth-specific mining chips, and large investments in GPU mining (unless mining burns GPUs out really fast?).
Regardless, most proof-of-work algorithms are already being optimized with ASICs (a la bitcoin)... Or, like Ethereum's ETHASH algorithm, they have a step requiring high-throughput random access to large amounts of memory, a la scrypt or argon (5GB or so for ETHASH right now, and slowly increasing). That's what's kept ASICs from being profitable for Eth, and is probably the limiting factor for FPGAs as well. Though maybe there are FPGAs with dedicated memory on par with a modern GPU?
Proof of stake is the fusion power of crypto. It's been 6-24 months away since what, 2016? I toured Consensys a few years back and at the time it seemed PoS was right around the corner.
OTOH, it could be cracked tomorrow and the value of dedicated silicon would go off a cliff. So I don't blame the GPGPU crowd for that, before even considering the engineering challenges of silicon/FPGA for eth.
Ethereum's Proof of Stake "beacon chain" is already live, with around $6 billion staked (since launch in december 2020). Next year is just formalizing and testing it taking over from the miners. So despite timeframe, there's little uncertainty about the outcome.
A question out of genuine curiosity: Doesn't the nonce puzzle difficulty adjust to the amount of compute the network has, making it a zero sum game? If every mining pool buys 20% extra GPUs, isn't everyone soon back where they started, wasting hundreds of millions in the process?
I'm not an expert at the mechanism design, but yeah, that's my understanding. The network's goal is to have a block every X seconds; and the difficulty periodically adjusts if blocks come in too fast / too slow (due to hashpower coming on/off line).
(Ethereum & Bitcoin both add some other year-decade timeframe factors to the difficulty, but don't think it affects this basic principle, which acts on week-to-minute timeframes).
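A sketch of that feedback loop, using Bitcoin's retarget rule as the example (every 2016 blocks the difficulty is rescaled by how far the actual elapsed time missed the two-week target, clamped to a factor of four; Ethereum adjusted per block, but the principle is the same):

    def retarget(old_difficulty, actual_seconds, target_seconds=2016 * 600):
        """Bitcoin-style retarget: keep blocks arriving ~every 10 minutes on average."""
        ratio = target_seconds / actual_seconds
        ratio = max(0.25, min(4.0, ratio))   # Bitcoin clamps the adjustment to 4x either way
        return old_difficulty * ratio

    # If everyone adds 20% more hashpower, blocks arrive ~20% faster, and the next
    # retarget raises difficulty by ~20% -- everyone is back where they started.
    print(retarget(1000, actual_seconds=2016 * 500))   # -> 1200.0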
I view it as a race to the margins. Anyone who can do it cheaper, or benefit from economy of scale, will get a larger slice of the pie, but at smaller margins... creating a cycle of consolidation.
I think original assumption was that folks running PoW at home on their GPUs would be able to compete, but due to efficiencies of specialized hardware and regional differences in electricity costs, that's just not the case.
---
Others might disagree, but my personal opinion is that this isn't a sustainable way to maintain decentralization, since it seems to obviously trend towards a few large "just turning a profit" players.
There might be some ways to adjust mining incentives to make it work, but this fundamental issue is why I think Proof of Stake has a much more viable future, as it sidesteps this issue (and the environmental one).
> If every mining pool buys 20% extra GPUs, isn't everyone soon back where they started, wasting hundreds of millions in the process?
In theory you could run the entire network on one CPU, though it would take a very long time for the difficulty to adjust downward by that much from its current level. However, the extra capacity isn't wasted—a network secured by the hashing power of a single CPU would be extremely vulnerable since a trivial expenditure would give an attacker >99% of the hashing power and the ability to create forks and double-spend almost at will. The point of spending so much on hashing is to ensure that no attacker could plausibly out-spend the rest of the network and thus acquire a majority of the hashing power.
Now, do we need all of the current hashing power to secure the network? I don't think so. That's a consequence of the initial block distribution system, with a fixed halving schedule for block rewards and a price which has risen much more quickly than anyone anticipated. The cost of reliably executing a double-spend via a 51% attack far exceeds the potential reward for pulling off the attack. At some point the market will be saturated, the price will cease to double every four years (or less), transaction fees will make up a majority of the block reward, and the resulting drop in the total market value of the block rewards will lead to a decrease in hash rate to a level which is sustainable and yet still sufficient to discourage any would-be attackers.
So Nvidia is releasing a line of crippled cards that will mostly be used for one year and then trashed? That is horrible! Glad it may be over soon, I guess.
I sorta see this as Nvidia trying to create some market segmentation not for the miners, but so gamers can actually get their cards before Nvidia loses mindshare. (Not that I think that's likely)
1) covid-19. Factory closures, employee, and supply chain issues.
2) 30-40% growth in gaming due to covid-19
3) trump tariffs (for the USA). Gpu exemption expired in January.
Crypto is affecting prices, but it isn't causing the supply issues. I don't think AMD has produced many 6800/6900s. Ryzen 5800/5900 CPUs are rarely in stock.
With the exciting side effect that some people don't realise they're taking ibuprofen, so they take ibuprofen + Nurofen Back Pain Relief and end up overdosing.
Gamers will generally try to avoid buying pre-owned graphics cards that have been used for mining. Running them 24/7 at full pelt tends to “age” them quicker than one that’s just been used for gaming.
That is not quite true. Miners that know what they're doing will undervolt their cards in order to improve power efficiency, which makes cards run cooler and at lower power.
You can’t tell from an eBay listing whether a given card was undervolted or overclocked. On average it’s far more risky than buying a gamer's card, so they're best avoided.
Also, undervolting isn’t always the correct choice; it depends on how valuable the coin being mined is relative to energy costs. Someone mining in their dorm room, for example, may not be paying based on electricity usage.
My understanding was that gaming cards are pushed far harder, at higher temps, with fluctuating power and thermals, which causes more issues than a single stable power limit and temperature
Where is your understanding coming from? There is no such thing as pushing cards “far harder, at higher temps” than when mining or doing other compute tasks that run the GPU at 100%. Failure rates land squarely on the side of higher temps, and the reasons are well understood https://electronics.stackexchange.com/questions/444474/can-i...
You might be thinking of spinning disk drives rather than GPUs. A lot of people suggest that leaving an HDD powered up and spinning is better than spinning up and down frequently, due to the temperature going up and down a lot and the added wear on this mechanical device. This is completely different from a GPU though.
Higher temps are bad, but thermal cycles are equally bad or worse. Different things on the card have different thermal coefficients of expansion. Getting warm and cooling makes everything flex and stresses solder joints, wire bonds, and thermal interfaces.
Miner cards have longer, sustained high temps. This is bad for life.
Gamer cards have lots of thermal cycles. This is very bad for life.
Miner cards are more likely to be undervolted to improve power efficiency and thermals. This is good for life. (Lower peak temperature, less electromigration).
Gamer cards are more likely to be overvolted and overclocked to improve peak performance. This is very bad for life. (higher peak temperatures, more electromigration).
That’s testing for thermal cycling over wide temperature ranges or longer lifespans. GPUs are used indoors and don’t have a very long lifespan.
The major risk factor for GPUs is electromigration, which directly relates to usage. A 40-hour-a-week gamer is extremely rare, but a mining GPU is pulling 168 hours a week.
Electromigration is a small risk factor in any kind of reasonable life. Especially if not overvolted (which is something that mostly gamers do-- miners are more likely to undervolt).
Solder fatigue breaking of solder balls is common. I have fixed lots of GPUs by reflowing them. GPUs do cycle over a large temperature range-- delta-T can be 50C+. While maps are loading, etc, you can have delta-T's of 25C+ every few minutes.
This is a thermal cycling induced failure mode. (Of course, a home oven doesn't accomplish proper reflow, so this is more of a "fix things for a couple months" trick as described in the posts).
I strongly disagree. The dominant failure modes of electronics these days are:
A) solder joint failure (thermal cycling)
B) capacitor failure (sustained heat).
Electromigration is a distant, end-of-life condition-- representing only a tiny fraction of failures of non-overvolted devices in a normal use period.
As your link itself says, in the top answer:
"But then there is an important question: How much does this decrease the lifespan? Knowing this, should you make sure that your graphics card stays cool all the time? My guess is no, unless an error was made at the design stage. Circuits are designed with these worst-case situations in mind, and made such that they will survive if they are pushed to the limits for the rated lifetime of the manufacturer. "
GPUs are not mechanical parts (well save for the fans but those can be replaced).
I would imagine thermal stress from heating and cooling would be the biggest issue - you don't get that under constant load.
Huge difference in wear, yes. But not in the direction you think, I think. Warming up and cooling down is more damaging for a card than running at a constant temperature. It 'jiggles' parts more.
Miners will under volt, under clock, and boost memory speeds in order to keep temperatures and fan speeds at a lower level. Card failures affect profits.
Both pages are focused on aerospace engineering (and perhaps fault-tolerant systems in general), so I'm not sure how I'd rate them as authoritative sources for the lifespan of electronics in general. Possible faults in a gaming GPU might not be as critical if they cause something to fail once a year, for example.
As someone with experience running a GPU mining farm for ~2 years, my anecdote: I had about 5% of cards break down during that time, and the majority of those were just fan failures.
There is the effect of 'electromigration' at least, which causes atoms of conducting materials to be transported over time because of the current (if I understand it correctly). That might be an issue over the long term, especially at the ridiculously small scale of chip manufacturing we've reached now.
Not only that, there is also diffusion caused by difference of concentration, which is increased by heat. And you have that concentration difference at the p-n boundaries/junctions in an IC.
Though I'm not sure how much actual damage you'd see in practice, whether the ICs tend to die with intense use before e.g. the capacitors mentioned above.
That's a bit of survivor bias. I used to buy truckloads of old PCs and recycle them when I was a teenager. I initially thought that this old tech was just really built to last, but in reality I was selecting the 1% or less that could survive being shoved around a rat-feces-infested warehouse at freezing temperatures before ending up in my parents' garage, to their dismay. Those survivors seemed to last freaking forever afterward, but again, that's because they were the random fraction that just happened to have that perfect balance of durability.
When I started buying new parts as an adult, the failure rates of e.g. GPUs were pretty disappointing in comparison to the biased expectations I had from those survivor PCs.
Haha, I also noticed that used parts that lasted ~2-4 years have a lower failure rate than new ones. All the ones that fail, fail early, and the surviving ones go through a sort of "extended burn-in" so to say.
I have a longstanding habit of buying my laptops refurbished - so, a few months of wear and sometimes a "scratch and dent". To date, every one of them has been a winner on longevity.
Same, only had a problem twice, with a failed USB/audio daughterboard on an Elitebook ($20 replacement) and a failed VRM capacitor on an MXM card (replaced it but the GPU itself failed after a year so I just got a new card, $100).
I'm really sad about new laptops having everything soldered on. If something fails, you either need a good reflow station (and skills) or you have to toss the whole thing, which is insane.
It also makes parts ridiculously expensive, like a system board for a Dell Precision now goes for ~$500 where it used to be under $100. All because the CPU, GPU, VRAM and even RAM are soldered on.
Honestly, I hate where all of this is going. So much for everyone going green.
The second article mentions thermal cycling -- I always thought that running 24/7 is actually less damaging than cycling (i.e. a gaming rig or a MacBook that does 30->90->30 °C multiple times a day).
For the wear and tear of thermal cycling to pile up, it's not required to have reboots or shutdowns. All it takes is temperature fluctuations, which in turn produce stress fluctuations, which induce fatigue wear on materials. Low-amplitude cycles are better and less damaging than high-amplitude cycles, but the damage still builds up.
To put things in perspective, not so long ago it was believed that below a certain stress delta some materials were immune to fatigue and practically eternal. It soon became apparent that this belief was not factual, and a phenomenon labelled very-high-cycle fatigue became a research topic. This type of fatigue is characterized by cracks being induced even at very low stress levels, due to defects such as impurities and even grain size in metal matrices.
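The usual engineering shorthand for this kind of low-cycle thermal fatigue (a simplification; the exponent depends on the solder alloy and joint geometry) is a Coffin-Manson-style relation:

    N_f \propto (\Delta T)^{-n}, \qquad n \approx 2

i.e. the number of cycles a joint survives falls off steeply with the size of the temperature swing, which is why repeated 50 °C swings do far more damage than the same hours spent at a steady elevated temperature.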
Yes. Higher temperature leads to faster degradation of capacitors. My experience with running PC routers 24/7 in non-air-conditioned spaces (~40 degC in summer) is that after 5+ years, systems that had been rock-stable started to crash/reboot every few days and had to be replaced.
Absolutely, it's materials science. Most electronics like this aren't meant to be run full-tilt 24/7, especially not under the conditions that crypto mining typically occurs. Subpar cooling, potentially dirty air - so lots of dust and particulate.
Not to mention crypto miners will often have the GPUs overclocked on top of running full tilt to get every last hash out of it. It's about as brutal of a situation as you can get for a GPU.
>Not to mention crypto miners will often have the GPUs overclocked on top of running full tilt to get every last hash out of it. It's about as brutal of a situation as you can get for a GPU.
This is false because miners often undervolt to achieve better efficiency. The popular GPU mining algorithms are all memory-bound, so you can undervolt your core clocks quite a bit and still get >95% of the original performance.
That depends on how valuable the last 5% of your hash rate is and on local electricity prices. Running an overclock, especially on memory, can objectively be the correct choice.
College dorms, for example, rarely charge based on electricity use. Some apartment complexes also include electricity as part of the rent.
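To make the tradeoff concrete, here's a back-of-the-envelope sketch. Every number (revenue per MH/s, power draws, prices) is an illustrative assumption, not a measurement from any particular card:

    # Rough comparison of an undervolted card vs. an overclocked one.
    # All figures below are illustrative assumptions, not benchmarks.
    REVENUE_PER_MH_DAY = 0.075  # USD earned per MH/s per day (assumed)

    def daily_profit(hashrate_mh, power_w, usd_per_kwh):
        revenue = hashrate_mh * REVENUE_PER_MH_DAY
        electricity = power_w / 1000 * 24 * usd_per_kwh
        return revenue - electricity

    # Paying $0.20/kWh, the efficient undervolt wins:
    print(daily_profit(hashrate_mh=47, power_w=120, usd_per_kwh=0.20))  # ~2.95
    print(daily_profit(hashrate_mh=50, power_w=220, usd_per_kwh=0.20))  # ~2.69
    # With "free" electricity (dorm, rent-included power), the last few
    # MH/s from an overclock are pure upside:
    print(daily_profit(hashrate_mh=50, power_w=220, usd_per_kwh=0.00))  # 3.75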
> College dorms, for example, rarely charge based on electricity use. Some apartment complexes also include electricity as part of the rent.
Those situations are rare. A student in a college dorm isn't going to be able to afford multiple GPUs for a mining rig, and if he's mining with one GPU, he's likely going to keep it when the crash comes rather than trying to sell it into a flooded market. Apartment mining is more believable, but even then the power draw is going to be an issue for them because of heat and noise. They're also going to be vastly outnumbered (in terms of GPUs operated) by professional miners, because most people don't have a few thousand dollars to drop on generating highly speculative assets.
Everything you just said is total nonsense. This idea that you are going to "wear out" a GPU is something people started saying when they were obviously bitter about not being able to find GPUs and seeing them being resold later for more than retail after they had been used for mining. There is nothing to back this up unless something melts.
Additionally, transistors can experience aging through various mechanisms [1], some of which are permanent and some of which can be fixed with a reset. Most manifest via a shift in the threshold voltage of the transistors, which can impact the operating frequency of chips or the stability of sensitive circuits such as memory cells. When ICs are designed they usually have a lifetime operating profile, e.g. 5 or 10 years, with max voltage and operation for x% of that lifetime at a given temperature. Simulations are then run pre- and post-degradation to ensure requirements can still be met. Fabs such as TSMC and Samsung provide models/info for transistor aging as part of their design kits.
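As a hedged sketch of how that lifetime budgeting is reasoned about, here's the usual Arrhenius-style acceleration factor; the activation energy is an assumed illustrative value, not a foundry number:

    import math

    K_B = 8.617e-5  # Boltzmann constant in eV/K

    def acceleration_factor(t_use_c, t_stress_c, e_a_ev=0.7):
        """How much faster a thermally activated aging mechanism runs at
        t_stress_c than at t_use_c (E_a is an assumed example value)."""
        t_use = t_use_c + 273.15
        t_stress = t_stress_c + 273.15
        return math.exp(e_a_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

    # A die held at 85 C ages roughly 5-6x faster than one held at 60 C:
    print(round(acceleration_factor(60, 85), 1))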
Not if you have filters set up and clean/change them regularly. I'm assuming miners monitor their temperatures as well; as long as it's under 80 degC, chips will work for years under load. The VRM components, though, are of rather poor quality compared to the GPU, CPU and RAM -- that's a major failure point.
> The CMP version is a straight-to-landfill product. As soon as it stops being profitable for mining it will have zero resale value, and become e-waste.
You are aware that this already happens to BTC ASICs when the new generation of ASIC comes out, right? Straight to the landfill. BTC is very wasteful, this is not new.
Agree on the "evil" part (or maybe "irritating"), but if I recall correctly from a few years ago, Nvidia has done this before with full access to the encoder only being available in the Quadro line.
> Nvidia does state that these GPUs "don't meet the specifications required of a GeForce GPU and, thus, don't impact the availability of GeForce GPUs to gamers." Frankly, that doesn't mean much. What does Nvidia do with a GPU that normally can't be sold as an RTX 3090? They bin it as a 3080, and GA102 chips that can't meet the 3080 requirements can end up in a future 3070 (or maybe a 3070 Ti). The same goes for the rest of the line. Make no mistake: These are GPUs that could have gone into a graphics card. Maybe not a reference 3060 Ti, 3070, 3080, or 3090, but we've seen TU104 chips in RTX 2060 cards, so anything is possible.
Combined with their aftermarket value (see https://youtu.be/XfIibTBaoMM), this is no more than Nvidia trying to spin goodwill whilst padding their bottom line.
Because cryptocurrency mining is now one of the biggest contributors to climate change. It is killing the planet. It is a moral imperative to shut it down by any means necessary.
Perhaps overstated, but it is a noticeable percentage of global electricity use spent on what is at best an enabling technology for the somewhat quixotic pursuits of a tiny fraction of the population.
You should of course make your own judgments, but many people do not consider that a good tradeoff.
Keep in mind that the bulk of crypto mining is concentrated in the places with the cheapest electricity, so the cheapest source gives the biggest advantage.
Unlike actual people, mining farms can be placed right next to the power source, so it's easier to use nuclear / wind / solar / hydro for such purposes.
> Unlike actual people, mining farms can be placed right next to the power source, so it's easier to use nuclear / wind / solar / hydro for such purposes.
While electricity transportation is an issue, it's not a huge one compared to the likes of storage and actually having something generating electricity. And, as it turns out, a lot of the mining farms just use coal (https://decrypt.co/43848/why-bitcoin-miners-dont-use-more-re...), with only around 40% of the energy coming from renewables. Even if they were only using renewables, it's still not zero carbon, since renewables also need to be built & since the total power consumption is increased, older coal plants might be used for longer.
And all of this for something whose economic value is comparatively low.
The link in your article to the primary source, the Cambridge University survey, is unfortunately dead. The alternative metric given (https://digiconomist.net/bitcoin-energy-consumption/) currently predicts 36.95 million tonnes of CO2.
I'd probably agree that this is still pretty high for what amounts to digital hoarding.
On the other hand, the second most popular blockchain, Ethereum, is estimated at half the energy consumption (https://digiconomist.net/ethereum-energy-consumption), has a plan to move to proof of stake, and has a lot more economic value.
The space is still new and bitcoin has first mover's advantage, name recognition and trust (most hashrate, most reviewed codebase, etc).
https://cbeci.org/ estimates Bitcoin's annualised consumption, based on a 7-day moving average, at 120.87 TWh. To be generous, let's assume all cryptocurrency mining is double that, i.e. 241.74 TWh.
That's only about 0.14% of total energy production, a lot of which comes from renewables (abundant cheap electricity means more profit for miners).
Even a grave overestimate, assuming the most polluting energy production, puts it at about 0.73% of global emissions, which still doesn't compare to any other sector: https://ourworldindata.org/emissions-by-sector
The source for the BBC article (https://cbeci.org/cbeci/comparisons) even gives it as 0.48% of total electricity production and 0.55% of total electricity consumption. The graph it and the article show is misleading, as it only compares electricity and not total energy consumption.
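Rough arithmetic behind those percentages; the world totals are ballpark figures I'm assuming, not numbers from the linked sources:

    # Back-of-the-envelope check of the percentages quoted above.
    btc_twh = 120.87              # CBECI annualised 7-day-average estimate
    all_crypto_twh = 2 * btc_twh  # the generous doubling from above

    world_primary_energy_twh = 170_000  # assumed order of magnitude
    world_electricity_gen_twh = 25_000  # assumed
    world_electricity_use_twh = 22_000  # assumed (losses explain gen > use)

    print(f"{all_crypto_twh / world_primary_energy_twh:.2%}")  # ~0.14% of energy
    print(f"{btc_twh / world_electricity_gen_twh:.2%}")        # ~0.48% of electricity produced
    print(f"{btc_twh / world_electricity_use_twh:.2%}")        # ~0.55% of electricity consumed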
That's neither true nor the reason why Nvidia is doing this.
What Nvidia is doing is like a dad making his kids share their sweets in a 50:50 ratio by taking advantage of the fact that one of his kids has a peanut allergy and the other has a walnut allergy and therefore 50% of the sweets contain peanuts and the other 50% walnuts.
gp's point is that even though it might have a positive effect on climate change, the decision was likely not made for altruistic reasons. It's like the apple charger debacle from a few months ago.
It's only evil if you believe in absolute software freedom. But in reality NVidia is perfectly within its rights to apply differential pricing for people who want to play games, and those who want to mine bitcoin.
Who can really blame NVidia for this? The actual target market for these products, gamers and the lower end graphics professionals, have been plagued by supply issues. In the long term NVidia stands more to gain by providing consistent supply.
NVIDIA should instead just require people to take an exam about either games or machine learning in order to be able to purchase a GPU. Problem solved.
There are, but these could be thrown out in a matter of weeks. As soon as bitcoin prices crash again, these cards are junk. By the time the next boom happens there will be a new generation of cards and these will be unprofitable. Nvidia is simply leveraging this predictable boom-bust cycle to make a quick buck selling a polluting product. This is Captain Planet-style environmental arrogance.
These cards aren't being used to mine Bitcoin; GPU mining hasn't been viable for Bitcoin since at least 2014. These cards are being used to mine altcoins.
The same as any other industry. The coins are all tied together in the same manner as something like the auto industry, biotech, aviation or real estate. Individual prices can rise and fall on their own merits, but there is a larger tide as investors choose to move their money in/out of the entire sector.
As the counterpoint: I, as a consumer of a GPU for non-mining purposes, appreciate the effort to reduce demand for the product from a certain market segment, so the price isn't driven up by the demand of a decentralized, global MLM scheme.
Also, you just don't buy that hardware -- so it's not "what you can use your hardware for" -- unless they're retroactively updating drivers to prevent the same use of already-purchased products.
I'd argue your concern is an issue only due to patents that may prevent a competitor from coming in and making a competing product if the main producers are engaging in bad behaviours.
Edit to add: I love the downvotes - they're such a strong way to sway someone's thoughts on something, so much effort gets put into clicking a downvote too - it must be serious business!
You shouldn't be, as this is yet another example of a company exploiting cryptographic signing and information asymmetry in order to lock you out of functional capability that you paid good money for.
That abusive practice is the only thing enabling this kind of business pivot. The pivot also doesn't achieve what Nvidia claims to be setting out for, because they are splitting already limited fab capacity between two distinct product lines, shrinking the supply available to each. If anything, this just makes sure everyone pays inflated prices due to scarcity.
What I'll say is that the absurdity of this trite dismissal makes it a large undertaking for anyone to even begin to teach you what cryptocurrencies are.
Most of the Silicon Valley billionaires acknowledge it as a promising technology; if that doesn't warrant some serious (read: many hours of difficult reading) consideration, then expecting strangers on a message board to help you do it seems a bit entitled.
I'll grant you that trite dismissal of cryptocurrencies in general is quite common on HN.
"What I'll say is that the absurdity of this trite dismissal ..."
Are you talking about your trite dismissal of my stating it's a global MLM scheme - which it fits the pattern for precisely? And also a Ponzi scheme because the final latest adopters are left holding the bag - meanwhile wealth was unnecessarily transferred weighted towards the earliest adopters.
Blockchain is a promising technology, designing it like Bitcoin to align people through financial gain (of an MLM) has unavoidable-unfixable pitfalls. You're welcome to go through my comment history to read more about the actual solution. You're making assumptions too that my stating it's an MLM scheme isn't backed up.
I find the irony hilarious: you claim my comment is a trite dismissal, yet yours has exactly the same depth as mine, the only truth in it being that blockchain is a "promising technology."
Your holier-than-thou defensive comments, articulated almost poetically, are getting more common though -- as the more intelligent people sucked into the MLM scheme get defensive and their guard goes up, feeling the need to defend their position, needing the price of Bitcoin to go up and up and up; and while you're financially incentivized to write and spend energy, others like myself with counter-narratives aren't financially incentivized to do so.
>meanwhile wealth was unnecessarily transferred weighted towards the earliest adopters.
How many companies and technologies is this true of? Does this make them all MLM schemes?
Nothing about your response indicates you're giving me a generous interpretation at all, especially given how much of this response is you dumping rhetoric on me. Being snarky and in the same breath elaborately calling me "holier-than-thou" seems like we're just going to have a name calling arms race.
Similarly, you treat 'more intelligent people getting sucked in and defensive' as the true explanation, and you don't seem receptive to another interpretation of why intelligent people think the technology is promising... it seems we may as well just leave it here. Thanks for your time.
It's entirely possible to extract value from a stock if the company represented by a stock is doing productive work.
Companies exist to make a profit, and many of them do so by providing a service or by selling goods. The profit is paid out through dividends or stock buybacks. As an owner of a stock you have to do absolutely nothing to extract value from it: the company is automatically paying money out to you. There is no need to worry about finding a future buyer of your stock, because you are assuming that people are analyzing the value of the company and buying based on that analysis. If the company keeps doing stock buybacks forever, eventually your shares will be the only ones left, and if you sell them you have cashed out your money without selling to a different investor.
With Bitcoin there are no dividends or buybacks. The only way you can realize value from Bitcoin as an investment is by hoping another investor will buy it from you. These complaints apply to gold as well: it doesn't generate revenue, it produces little value beyond industrial uses, and all it should do is track inflation, which is a very low bar, yet in practice cryptocurrencies and gold are outpacing inflation. Stocks too, but if you invest based on fundamentals there will be a day in the future when your analysis is spot on. Fundamentals would tell you to stay away from Tesla stock.
But this isn't where problems end, it's where they start. Currencies are not supposed to appreciate. They are supposed to stay the same over long periods of time with a small amount of inflation (around 2%). Bitcoin failed to become a currency and that's why it's worthless.
>you don't seem receptive to another interpretation to intelligent people thinking a technology is promising
I know you are not talking about me but I specifically stated that literally any other cryptocurrency other than Bitcoin is fine, even Ethereum is better than Bitcoin. The insistence that a single specific flawed cryptocurrency is worthy of defense is what really irritates me. There is no need to defend Bitcoin. Stop defending it. You can talk about any other cryptocurrency. You can talk about decentralized apps, you can talk about lightning, you can talk about defi, you can talk about uniswap or rubic, you can talk about prediction markets. Please, talk about anything other than Bitcoin and if you do talk about Bitcoin recognize its flaws and weaknesses.
I'm sorry, but this is a large block of text, and skimming through it, you're strawmanning me.
The analogy was simply because GP posited that the pattern of wealth transfer from late to early adopters indicated an MLM scheme and that's clearly a shared trait among companies which are not MLM schemes.
It is not at all the defining trait of an MLM scheme.
I'm not trying to be rude, but the rest of your comments have nothing to do with anything I've said. I'm not taking a maximalist position, and I also don't agree that we should "stop talking about Bitcoin". It's evolving, so whatever you think or observe about it is on some level subject to change.
>I'll grant you that trite dismissal of cryptocurrencies in general is quite common on HN.
The unfortunate reality is that Bitcoin advocates all sound like people who bought into an MLM scheme. They desperately convince others that it has value and should go up.
That doesn't make it an MLM scheme any more than Tesla is.
And the counter balance I see generally are similar to the other commenter, people who will unload angry rhetoric but demonstrate no understanding of the Nakamoto Consensus algorithm much less address any of its merits.
I don't disagree, but I suppose the structure of my comment didn't convey the message correctly, that is, that NVDA is just maximizing profits while screwing over gamers because a) they can get away with it and b) they are an ever decreasing portion of their revenue.
The argument is simple, really. By intentionally dropping the hashrate, you reduce the incentive to buy these GPUs for mining. This means that when or if the mining craze stops, as in 2018, there would be plenty of second-hand 3xxx series cards to go around, which would only hurt NVDA's pocket. Recall that during the previous craze, NVDA was stuck with a huge inventory of GPUs after it ended, inventory that cannot be recycled. IIRC they announced that they had more GPUs from the 1xxx series available, namely 1050 Tis.
NVDA's stance is that miners are the reason for scarce GPUs, but they are not. There has been scarcity since the 3xxx series was announced, and similarly for AMD. The reality is that there is simply not enough silicon being produced, with AAPL hogging TSM's 5nm for a year; AMD and NVDA competing for 7nm wafers for AMD's CPUs, GPUs and the consoles' SoCs and for NVDA's GPUs; and INTC soon to join them.
By cutting 3xxx production for gaming to accommodate crypto, they ensure that there won't be second-hand GPUs flooding the market and eating their profits once the craze is over. Why is that? Because it will not be as cost-effective for miners to buy 3060s instead of the dedicated chips, due to lower performance per watt. Once the craze is over, the miners won't be able to liquidate their dedicated cards, so the demand for gaming GPUs remains intact; thus NVDA will be able to sell their 3xxx and 4xxx series without getting undercut by second-hand cards, and they will not be stuck with a huge inventory as was the case in 2018.
Edit:
In fact, second-hand 3060s would outcompete anything NVDA could offer for 1080p and 1440p gaming, including anything they could release with their 4xxx cards. In addition, there isn't enough incentive to even go beyond 1080p or 1440p for gaming, which is why you see NVDA pushing how great 4K and even 8K gaming is, with their recent publicity stunt of 8K gaming on a 3090.
What you’re asking for is to get royally screwed by miners now - again - just so you might have the chance to get a cheap and very-used card later. And you’re blaming Nvidia for losing this imagined opportunity a year from now.
Right, like you point out, in 2018, Nvidia was making a lot of money on leftover 1080s. And they’re proposing to not do that this time. Are you suggesting that gamers are going to bypass a $330 card for a year in hopes that mining will crash and they can get a used one for $200 that’s been running hot for an entire year, and then after waiting, give up on that and buy a brand new GPU?
> The reality is that there is simply not enough silicon being produced.
You say potato, I say demand. You’re hand-wavily suggesting that mining is not affecting supply at all. That’s just not true. Mining demand is making an already bad problem (for gamers) much worse, just like it did last time.
What is the huge profit windfall you’re suggesting happens after this current mining bubble pops? Nvidia isn’t going to have an oversupply of cards to sell, and they aren’t hiking the price of the 3060.
This all seems completely speculative too. You’re complaining about a supposed future problem, and ignoring what this does for gamers today.
And it’s weird to frame this as gamers being screwed by Nvidia just because there could have been some awesome second hand market later. The fact is that gamers were totally screwed by miners in the first place when cards weren’t available for a year and the second hand prices were quadruple the list price. It’s not a huge gift to gamers that the market is flooded with cheap leftovers after they were hosed and frustrated and extorted by miners for a year or two.
Gamers are getting screwed by Nvidia today, since the silicon they would normally bin as gaming cards is being allocated to mining cards.
In return they're only given the 3060, which is still profitable for miners, and Nvidia refuses to restrict mining on any other GeForce gaming card.
Miners are also screwing over gamers by buying their gaming cards today. In the future, miners will be a positive for gamers as they flood the used market with gaming cards.
> the silicon they would normally bin as gaming cards is being allocated to mining cards.
That's an assumption, it's speculation. If true, it means there will be no 3060s for gamers because they'll sell out as CMP, in contradiction to what Nvidia said. We will see if your assumption comes true...
I don't know why your speculative assumption makes any sense. If the chips that could be sold to gamers were being sold to miners instead and effectively "screwing" gamers, then it means that there's no point to segmenting the market, no advantage for Nvidia. It would be no different than letting miners buy the 3060s, and just not having CMP cards.
> In the future, miners will be a positive for gamers as they flood the used market with gaming cards.
That doesn't make up for the losses to gamers up front, it does not compensate by 100%, it can't. It would be better (for gamers) if there was no bubble.
> What you’re asking for is to get royally screwed by miners now - again - just so you might have the chance to get a cheap and very-used card later. And you’re blaming Nvidia for losing this imagined opportunity a year from now.
Re very used: no. Cards used for mining are a) undervolted and b) underclocked, both increase lifespan and I'd take them over an overclocked card used for gaming, plus they don't run that hot because at those scales cooling is expensive.
> Right, like you point out, in 2018, Nvidia was making a lot of money on leftover 1080s. And they’re proposing to not do that this time.
I never said NVDA was making money off leftover 1080s, because there weren't leftover 1080s; it was 1050 Tis, which they are selling off again. The same 1050 Tis that they held onto during 2018.
> Are you suggesting that gamers are going to bypass a $330 card for a year in hopes that mining will crash and they can get a used one for $200 that’s been running hot for an entire year, and then after waiting, give up on that and buy a brand new GPU?
As I mentioned above, mining GPUs are usually in better condition than cards from some dusty and improperly cooled gaming rig, and, simply put, there aren't $330 cards for anyone to buy. If people wish to buy $330 cards, they can buy them, but for me, at 21:9 1080p@60 there is no reason to upgrade from my used 1080 Ti, and had I really needed something better, I'd consider a used 3060.
> You say potato, I say demand. You’re hand-wavily suggesting that mining is not affecting supply at all. That’s just not true. Mining demand is making an already bad problem (for gamers) much worse, just like it did last time.
I never said mining isn't affecting demand, but a) the crypto rally started loooong after NVDA had supply issues, b) AMD has supply issues with their CPUs, which are utterly irrelevant to mining, and c) this generation of consoles, made on the same wafers, can't meet demand.
The demand was there before this cycle of mining frenzy.
> What is the huge profit windfall you’re suggesting happens after this current mining bubble pops? Nvidia isn’t going to have an oversupply of cards to sell, and they aren’t hiking the price of the 3060.
They won't, because they are artificially limiting the hashrate, and they are producing more chips for mining. Last time around, people managed to get around the no-display issue and could game on "mining GPUs" (see Linus Tech Tips, they have two videos on the topic), but now NVDA is locking people out with firmware, making it impossible to reuse the mining GPUs.
I am complaining exactly because NVDA is taking care of their shareholders and their bottom lines. I am complaining because it goes against my interests.
Gamers are not screwed by miners; gamers are screwed because the pandemic caused a huge influx of people demanding chips, so much so that even automotive manufacturers can't find chips.
I am tired of this conversation, and you are taking my sentences out of context and blowing them out of proportion.
> I am complaining because it goes against my interests.
So you are already certain you want to buy a used 3060 next year?
> I never said mining isn't affecting demand
You said "NVDA's stance is that miners are the reason for scarce GPUs, but it is not."
> the crypto rally started loooong after NVDA had supply issues
This is irrelevant. You mean this time, right? There wasn't a supply problem last time before bitcoin miners bought everything. It doesn't make any difference to gamers which contributor to scarcity came first, when you're talking about 3060 sales that haven't started yet.
> I am tired of this conversation and you are either taking my sentences out of context and blow them out of proportion.
I think you're taking this entire issue out of context and blowing it out of proportion. You claimed gamers are being screwed, when they're not, they're actually being helped. Many gamers here and on other threads are happy that Nvidia is taking steps to curb miner scalping of the 3060. The change here is to a single model, the 3060. Segmenting that market over this single model helps gamers today, and is not going to kill the second hand market for gaming cards. All other models will be untouched, and the 3060 will still be available second hand.
> So you are already certain you want to buy a used 3060 next year?
I wanted to buy a graphics card to run my models. I don't have the $$$ to buy a datacenter card for it and in general, I am affected by the demand and the scalpers. I was considering 3060 because my models are small and it is faster than my 1080ti, but at 21x9 1080p@60hz, it's not an upgrade for my gaming experience because that is capped by the monitor, and I am really not willing to buy a new monitor.
> You said "NVDA's stance is that miners are the reason for scarce GPUs, but it is not."
The reason is mentioned before: too much demand for chips, not enough fabrication. You, instead, blame the scarcity on mining; mining is part of the demand, but it is not the main factor.
Linus and many others, me included, disagree with your opinion that NVDA is helping gamers. NVDA does not care about gamers, they are a corporation, they care for their bottom line and their stock price.
> Linus and many others, me included, disagree with your opinion that NVDA is helping gamers. NVDA does not care about gamers, they are a corporation, they care for their bottom line and their stock price.
This argument that Nvidia is a company is a straw man. Valve is as much a corporation that doesn't care about gamers as Nvidia. I guarantee that Nvidia cares about gamers, precisely because gamers have a huge influence on NVDA's stock price and bottom line.
I love Linux, but Linus has a vested interest in opposing Nvidia in public, and has a long history of making inflammatory remarks, even to people he works with. Using his opinion as support of your claims here undermines the credibility of the discussion, as far as I'm concerned.
To attempt to move the goalposts back to where you first placed them: the question here is whether reserving 3060 sales for gamers is good or bad for gamers. You claim it's bad for gamers versus the second-hand market next year, and I claim it's good for gamers versus the first-hand market today. We haven't actually disagreed about this yet, because you haven't addressed how your position affects gamers right now.
I agree that this might lead to a different second-hand market next year. It is true that second-hand prices next year might not be as low without first having a big mining bubble that prices gamers out and then crashes. I just don't believe that cheap cards later make up for what miners have already done to you. If you do, it seems like you're ignoring some of the big downsides of what's happened before and what's happening now.
> I am affected by the demand and the scalpers. I was considering 3060
If you want to buy a 3060 now, I don't understand why you can't see the CMP announcement as a good thing for you and your own bottom line.
If you want to buy a used, mined 3060 later, and give your money to a miner and not Nvidia, then I do understand your points.
I am referring to this [1] Linus, not Torvalds. Watch the video. You have already decided that it is the miners who are screwing you over, and even when presented with evidence to the contrary, you keep shilling NVDA.
Sorry, you're right, I misunderstood the Linus reference as Torvalds, not Linus Tech Tips.
But so you count reductionist, click-bait opinion commentary on YouTube as "evidence"? Commentary that comes from a company whose profit motive is driven by clicks and attention? Saying that companies primarily care about their bottom line is tautological; it contains no information. It's still a straw man regarding whether or not segmenting 3060s is a net positive for gamers.
Last chance... you still haven't addressed the first-hand market effect on gamers of trying to get miners to buy something else. You haven't yet backed your claim that this is bad for gamers right now. I've basically agreed with what you said might happen, that the second hand market won't look like what happened in 2018. I can only assume that if ad-hominems are going to be the response to an honest question to what you said, then you don't actually have an answer?
Yes, I did, twice. The arguments in the video do not contain a single point that hasn’t been covered by people repeating the talking points here.
The thing Linus didn’t address, and the thing people don’t seem to understand about the basic economics of the situation is that there is no amazing second hand market getting flooded with cheap GPUs without first getting hosed by miners, having the prices go through the roof and not being able to get one for the next year. If that happens, then yeah, sure, the used ones will be cheap some time later, and you don’t get to choose when or even if mining crashes before the next GPU update. If that’s really what you want... enjoy.
That video is primarily an ad. It’s an ad for a VPN service and literally a ball-shaving kit. “If you imagine it any other way, then congratulations, you played yourself.” - Linus.
You’re making a huge assumption that they could be resold, even if they’re the same chips. This could be an entirely new market for chips that have yield defects in the graphics units, that will not ever work for games, and were previously already put in the trash.
And your theory doesn’t explain two things: - how this hurts gamers now, and - why if Nvidia just wanted profit and the chips are useful for games, why wouldn’t Nvidia simply do nothing and let the miners buy all the 3060s. Wouldn’t that be the most profitable thing here?
In fact, why wouldn’t Nvidia just hike the price of the 3060? They could make a lot more money on miners if they wanted to, and really screw gamers. But that’s not what’s happening. So from a gamer’s perspective, I just don’t get being mad at Nvidia rather than being mad at the miners who caused this to happen, again.
> And your theory doesn’t explain two things: - how this hurts gamers now, and - why if Nvidia just wanted profit and the chips are useful for games, why wouldn’t Nvidia simply do nothing and let the miners buy all the 3060s. Wouldn’t that be the most profitable thing here?
No, it is not. If they mispredict when the mining craze ends, they will be stuck with a huge inventory and no way to deal with it. The reason they would be stuck with the inventory is that miners would be able to sell second-hand cards at prices NVDA simply can't compete with, so NVDA ends up as the bag holder, just as happened with the 1xxx series.
In fact, I'd wager that DLSS and RTX were used as a hacky way to drive people away from the 1xxx series, because the 1xxx series was more than sufficient for 1080p gaming, which is the most common resolution. The 3xxx series, even the 3060, is overkill even for 1440p, let alone 1080p.
This, to me, suggests that had NVDA not gimped the hashrate of the 3060, once the craze was over and the market was flooded with second-hand 3060s, people would not be incentivised to buy anything above a used 3060, because it is a) overkill for 1440p@120Hz and b) much higher perf/$ than anything NVDA could offer with the 3xxx or 4xxx series. It wouldn't make sense to go with anything higher anyway, because 4K gaming doesn't seem worth it unless you are playing on a TV.
Strange theories on top of strange theories. The GTX 1080 is good enough forever? People are using 1080p forever? Games are staying put with the fidelity and textures and poly counts they had 5 years ago, and not improving any more? Last time I checked, there weren’t all that many games that run at 60Hz 1080p and never dip.
You're blowing his words out of proportion. Just by looking at the Steam hardware survey you'll see that the majority of gamers buy mid-range cards and own 1080p monitors. The most popular games aren't the latest AAA titles either.
There isn't a game-settings survey, but I'd bet most gamers with mid-range cards run AAA games on medium, as they value being able to play over a beautiful slideshow.
The GTX 970 is just borderline whilst the GTX 1080 is "pretty much sufficient" for modern games according to this benchmark: https://youtu.be/bhLlHU_z55U
What are you talking about here? My argument is rather simple, and you are wilfully misinterpreting it to make your point. If you want to game at 1080p, there is very little reason to spend money on a big upgrade; even going with the 3xxx series is overkill.
> Last time I checked, there weren’t all that many games that run at 60Hz 1080p and never dip.
CITATION NEEDED.
The 3060 Ti is benchmarked at 4K hitting mid-to-low 50s fps on high settings in Shadow of the Tomb Raider, and hitting 100 fps at 1440p with 0.1% minimums at 90 fps.
> You’re making a huge assumption that they could be resold, even if they’re the same chips. This could be an entirely new market for chips that have yield defects in the graphics units, that will not ever work for games, and were previously already put in the trash.
Chip binning is nothing new, the article even explains this phenomenon.
Fully agree. Didn't suggest otherwise. That doesn't mean that the CMP cards are binning in the exact same way. I mean, isn't it a guarantee that the binning criteria change here to accommodate CMP? Isn't it entirely possible that the new bins are opportunities to sell wafers that couldn't be sold to gamers at all? Nvidia already has market segments for compute cards with no video. Yes, I'm speculating. So is parent & the article. I'm not sure it even matters. What matters is that gaming supply is less destroyed by mining demand.
I may be off base here, but isn't this how the Nvidia GeForce Now service works? Rendering on their systems and streaming to a dedicated client?
I have used this service to play a few different games which I have licenses to, where the developer no longer supports the Mac; oddly, this seems to have gotten worse with M1 chips.
> oddly this seems to have gotten worse with M1 chips.
If a developer wasn't around or willing to recompile and test their game on x64 to support Catalina (which dropped support for 32-bit apps), then it's not surprising to me that they are also unable or unwilling to recompile and test and support the game on M1 ARM chips.
I don't know about that. Plenty of people are using Intel non-F CPUs which have iGPUs.
Honestly, the worrying part is that NVIDIA may block this. If I remember right, they did something similar back in 2017 with the P106 mining cards (GTX 1060 equivalent) when people bought them for cheap.
No? Why would they need to be in different machines? You can stream from one GPU to another, whether they're in the same machine or not. One GPU may be a cheap little one built into your motherboard. You probably already have it anyway!
And even if you did need two machines... again that's how gaming streaming services work. Cheap end-user machines or iPad or just TV, and then a gaming system streaming video output somewhere else.
>No? Why would they need to be in different machines? You can stream from one GPU to another, whether they're in the same machine or not. One GPU may be a cheap little one built into your motherboard. You probably already have it anyway!
That's not really called "streaming". And given that somewhere in this thread there was a YouTube link on how to do that, which required test mode in Windows, it's still not practical.
Then you need to use the word "streaming" correctly. Video streaming is not copying data to a different frame buffer. Nvidia, with their Optimus technology, doesn't call this "streaming". You used the same term for both.
None of these links are related to the conversation here. The first one is even empty. Neither for "video streaming", nor for the frame-buffer (Optimus) stuff.
Make up your mind. If I counter your "video streaming" argument, you can't come back with "but I meant streaming on the same device" without even mentioning either a technology, or the most important thing there (the frame buffer). Now you want streaming to mean something that is used in game engines. How those 3 links help in making the mining card usable is for you to prove.
If it has CUDA support it might make sense post-hashing for a cheap deep learning setup. I have two P106-100s (last gen "mining-only" cards) that were very cheap (compared to a GTX 1060) secondhand. The only hit is in PCI-E bandwidth if they decide to go with nerfed PCI-E 1.1 only.
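The bandwidth hit is easy to ballpark. These are theoretical x16 link rates (real-world throughput is lower), and the 1 GB batch is just an example size:

    # Rough host-to-device transfer time per GB at different PCIe generations.
    batch_gb = 1.0
    pcie_gb_per_s = {"1.1 x16": 4.0, "3.0 x16": 15.8}  # approximate peak rates
    for gen, bw in pcie_gb_per_s.items():
        print(f"PCIe {gen}: {batch_gb / bw * 1000:.0f} ms per GB transferred")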
Will CMP versions really be so single use as to be only good for mining?
I want to know if I could use one as a second card for 3D rendering. Although I guess the render farm people will know about that and suck those all up...
I am hoping to repurpose the GPU/CMP for a hashcat rig once the cards no longer provide income for miners. I fear I will no longer be able to use GPUs for this, and am crossing my fingers for the CMPs.
So people are going to build entire data centers to look for combinations of bytes that yield a funny-looking hash. And when it stops being profitable to do so, the e-waste it generates is somehow the fault of the chip maker?
Yes. When a chip maker takes a multipurpose product that can be reused when no longer fit for one purpose, and instead makes it single purpose and not reusable for market segmentation reasons, they are absolutely somehow at fault.
While officially impossible, in theory it should work, in the same way laptop GPUs do (the dedicated GPU hands the rendered image back to the integrated GPU for seamless switching instead of outputting it directly).
There are used Tesla M6 cards (top-of-the-range Maxwell GM204 cores, 8GB of RAM) going really cheap sometimes. They have no video output, but you can use them in laptops with hybrid graphics thanks to the output being routed through the CPU's iGPU, as you said.
It needs quite a bit of driver fuckery to be recognized as either a GTX 980M or a Quadro M5000M, and you lose HDMI/DP output, but it's not a bad card for an upgrade if you only use the internal display.
I'm surprised there are no MXM to PCIe x16 adapters, these kinds of cards are cheaper than desktop ones (due to the market/pricing being totally screwed) while providing similar performance.
Depends on what NVidia takes out. If they remove the triangle fill units, they're unsuitable for graphics but OK for "mining". That leaves the more general purpose parts that let you do arithmetic in parallel.
Those might also be useful for machine learning.
First they came for the miners... As Nvidia opens itself up to these kinds of games, I can see how, say, Russia or China would legislate for the drivers to refuse to run specific crypto algorithms, or specific neural-net training if it includes specific names, terms, or phrases.
"RTX 3060 software drivers are designed to detect specific attributes of the Ethereum cryptocurrency mining algorithm, and limit the hash rate, or cryptocurrency mining efficiency, by around 50 percent."
Open-source drivers have been long overdue. Now there are real billions of dollars staked on that need.
Yes. I am using an Nvidia GPU even though I know it's so non-free, so unfriendly, that Linus gave them the middle finger and all. But I thought it was okay since I was going to use it for proprietary video games anyway. I never thought they were this evil, though. With this move, Nvidia products are forever on my boycott list.
Not a fan of miners disturbing GPU availability and pricing, but this is a bad move. It's not an engineering move, it's a marketing move, and I'm sure the first attempt at doing this will come down to something just as shallow, like a PCI device-ID check or a resistor-strap check. Someone will find it worth the cost to make an FPGA-based PCIe bridge that responds to the right knocks in the right way.
I suppose they recognize a specific algorithm. They already do special-casing for a number of game titles, and have been for a long time. Not a problem to match one more pattern, I assume.
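Nvidia hasn't said how the detection works, so purely as a speculative illustration (every signal and threshold here is invented): Ethash is memory-hard and streams pseudo-random reads over a roughly 4 GB DAG, so a crude driver-side heuristic might look for that workload signature.

    from dataclasses import dataclass
    from typing import List

    # Speculative illustration only -- not Nvidia's actual detection logic.
    @dataclass
    class Workload:
        buffer_sizes_gb: List[float]
        dram_utilisation: float  # fraction of peak memory bandwidth used
        fp32_utilisation: float  # fraction of peak compute used
        duration_s: float

    def looks_like_ethash(w: Workload) -> bool:
        dag_sized = any(4.0 <= b <= 6.0 for b in w.buffer_sizes_gb)
        memory_bound = w.dram_utilisation > 0.9 and w.fp32_utilisation < 0.3
        sustained = w.duration_s > 600  # runs flat out for minutes, unlike a game
        return dag_sized and memory_bound and sustained

    def capped_hash_rate(w: Workload, full_rate_mh: float) -> float:
        return full_rate_mh * 0.5 if looks_like_ethash(w) else full_rate_mh

    miner = Workload([4.3], 0.95, 0.10, 3600)
    game = Workload([1.5, 0.5], 0.60, 0.80, 3600)
    print(capped_hash_rate(miner, 50.0), capped_hash_rate(game, 50.0))  # 25.0 50.0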
Not the first time they do this, either. They started limiting performance via firmware/software since Fermi iirc. Basically, nVidia's GTX and Quadro cards use the same exact chips but the latter have additional features and higher performance in professional applications.
GPU mining is going to become much less attractive this year due to Ethereum moving to Proof of Stake, right?
If this is the case (perhaps an expert can chime in) - it makes sense to limit the hash rate now, to prevent these cards being dumped later this year? And to redirect miners towards higher-margin products in the meantime, whilst ensuring better supply for gamers (who are the actual long-term customers).
The ETH2 'beacon chain' is a meta-chain that's supposed to checkpoint a bunch of 'sub chains' for scalability. But they have launched it with many big questions unanswered.
What does a 'sub chain' actually look like to interact with? How do you coordinate with many of them? All these questions have answers in theory, but not answers in solid production ready interoperable code.
Furthermore some of these problems like 'how do you store a bunch of sub-chains' are questions not even properly answered in ETH1 nodes for the much simpler one chain case.
By launching the beacon chain early, the organizations developing ETH2 can validate and make money while all these questions are figured out. This is where the money goes instead of into improving ETH1, since that's a tragedy of the commons at this point.
ETH2 is only worth money if these questions are figured out and ETH1 is somehow brought under the beacon chain's governance. Exactly how this is to happen... well I haven't even seen anything credible on this. Greenfield addition of chains under the ETH2 beacon chain is generously described as 'incomplete' moving ETH1 under ETH2 is much more challenging and I have not seen a plan with any level of detail.
Given that everyone is making bridges to ETH1 now, and given typical development timelines, I would give a medium-to-high chance that ETH2 misses its window by a couple of years, and that activity moves to other, faster chains that are available now and capable of siphoning off traffic from Ethereum using bridges until they reach their own critical mass.
If I had to put money on it I would place 'proof of stake Ethereum' (defined as ETH1 under the POS beacon chain) more than 2 years out.
I could see the sub-chains (greenfield) working by EOY, although I wouldn't grant it a high probability.
Also if I had to put money on it I will bet on a halt of the Ethereum 1 chain for a time greater than 24 hours within that period, due to lack of maintenance on ETH1 nodes and increased stress from DeFi activity.
-----
ETH2 is kind of like 5G: yes, there is a very real 5G network protocol, but in terms of public messaging everything is "5G", and communicating anything about how it works or when it will be available is filled with pitfalls and nuances that are difficult for the highly technical, much less the general public, to understand.
IMHO the most important endpoint for ETH2 isn’t ETH1 under the beacon-chain, or greenfield projects under the beacon chain, but rather major projects that have already been in development for years, and already had a testnet targeting some alternative sub-chain substrate (e.g. Polkadot, Near), re-evaluating their alternatives at mainnet launch time, and choosing to use ETH2 as their mainnet substrate chain instead.
That could create a lot of momentum/adoption for ETH2, very quickly. And it wouldn’t take much: all these substrate projects are intentionally architected so that you can just develop for them as if you were developing for an ETH1 side-chain, and defer all the operational questions to network launch time. They’ve intentionally commoditized themselves!
So, as long as ETH2 is the best choice for a substrate when these projects go to mainnet, it’s what they’ll pick. And, for many reasons (that all mostly come down to “lifetime cost of bridging to either ETH1 and/or chain-foo-where-DEX-foo-lives”), ETH2 may be the best choice.
(Imagine if you developed your project against some Postgres-ish-DB-aaS cloud provider like Greenplum Cloud, but only used regular Postgres features; and then, when you went to production, you looked around and decided that Amazon RDS was good enough—and on top of that, required no re-engineering, since you weren’t doing anything fancy.)
> and choosing to use ETH2 as their mainnet substrate chain instead.
But why exactly? Until ETH1 is in the ETH2 chain ecosystem ETH2 is just another blockchain platform, providing no network effects.
ETH2's big selling point was higher throughput but bridges are already starting to commoditize access to more throughput from ETH1. ETH2 could provide smoother access than the bridge interfaces provided by other chains. (I'm speaking of bridges both to and from ETH1 and within the bridge blockchain's ecosystem like IBC)
It's not that I don't think ETH2 could win this. There's a clear path to victory.
- ETH1 integrated ASAP, going all in on expanding ETH1 capacity and bringing it under ETH2
- Provide a subchain interface that's better than IBC or any other cross-chain interface on the market
- Fullnode tooling that really works well (partial syncing for block availability, seamlessly syncs multiple chains etc)
Once you have these you can keep ETH1 users via network effect and provide access to more capacity more easily than other potential platforms.
But I don't see a clear plan from ETH2 to do either of those things. Cross chain communication is a theory-only problem devoid of the extremely polished tooling it will need to win and ETH1 under the beacon chain doesn't even seem to have a plan.
The difference is that ETH1 will bridge to ETH2 in the future, and therefore by choosing ETH2 as your substrate, you’re predicting that transaction costs for sibling-transactions crossing from your chain to ETH1 will become cheaper when that happens.
A similar promise cannot be made for any other substrate network, as ETH1 isn’t going to bridge to any other substrate network.
Since it’s not like sub-chain mainnets can pack up and move house after they’re launched, when launching one, you have to make choices based on how things will be, rather than how things are. It’s sort of like choosing a city to build a corporate headquarters in, based on what the civic infrastructure is likely to look like in 20 years.
There are bridges to and from Ethereum right now. I use them every day because I need a fast sidechain for my application.
Choosing ETH2 is predicated on the assumption that ETH1 will be bridged more cheaply and easily than existing bridges can provide and that the opportunity cost of waiting is worth it.
I had to make that decision personally, decided I can't wait for ETH2, so we're using a bridge that's available today and building an ecosystem elsewhere.
Several other product leads that I know are making the same decisions, they can't just put their lives on hold and trust that ETH2 will solve problems for which there's no public timeline or even a public solution.
What are your thoughts on EIP-1559 and its impact on mining profitability? That is supposedly coming this summer, and my understanding is that it replaces the fee auction with a protocol-set base fee and burns fees that would otherwise go to miners. I'm not sure how large an impact that will have on mining profitability, but I was under the impression that some miners were going to pump and dump their cards right before they anticipate GPU profitability going down, so they can make some scalper money. I was thinking miners would be redistributing their supply in the next 3 months or so.
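For context on the fee change: a minimal sketch of the EIP-1559 base-fee update rule as I understand it (constants simplified, and the 15M/30M gas figures are just example values):

    # Simplified sketch of EIP-1559's base-fee adjustment; the base fee is
    # burned, miners keep only the tip. Not production code.
    MAX_CHANGE_DENOMINATOR = 8  # base fee moves at most 12.5% per block

    def next_base_fee(parent_fee: int, gas_used: int, gas_target: int) -> int:
        if gas_used == gas_target:
            return parent_fee
        delta = parent_fee * abs(gas_used - gas_target) // gas_target // MAX_CHANGE_DENOMINATOR
        return parent_fee + max(delta, 1) if gas_used > gas_target else parent_fee - delta

    # Three completely full blocks in a row ratchet the burned base fee up
    # by 12.5% each block:
    fee = 100_000_000_000  # 100 gwei
    for _ in range(3):
        fee = next_base_fee(fee, gas_used=30_000_000, gas_target=15_000_000)
    print(fee / 1e9)  # ~142 gwei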
At some point nvidia would be better off mining with these gpus than selling them.
The oddest solution I can think of is gamers lease graphics cards at a low low price and nvidia mines on them in the off-hours. With a big enough operation they could force all the other miners into becoming resellers. It's free, distributed electricity and it gets the gamers off their backs.
(While I have your attention, I'm pretty sure proof of work is the paperclip maximizer sci fi warned us about.)
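Back-of-the-envelope on the "mine it themselves" idea above. Every number is a rough early-2021 ballpark I'm assuming, not Nvidia's actual economics:

    # Rough payback period if the maker mined on a card instead of selling it.
    card_msrp_usd = 329               # RTX 3060 list price
    hashrate_mh = 50                  # un-limited Ethash rate, roughly
    revenue_per_mh_day = 0.075        # USD per MH/s per day, assumed
    power_w, usd_per_kwh = 170, 0.05  # card draw and an industrial power rate

    daily = hashrate_mh * revenue_per_mh_day - power_w / 1000 * 24 * usd_per_kwh
    print(card_msrp_usd / daily)      # ~90 days of mining to out-earn the sale price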
> The oddest solution I can think of is gamers lease graphics cards at a low low price and nvidia mines on them in the off-hours.
This would be a complete non-starter in markets with high electricity costs or temperatures. Running an RTX 3090 at 100% load during the summer where I live would cost $140/month and would heat up my room to about 65°C.
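For what it's worth, the $140/month figure checks out if you assume whole-system draw around 550 W and a price of about €0.35/kWh (both numbers are my assumptions):

    # Sanity check of the monthly cost of running a rig flat out.
    system_draw_w = 550    # 3090 plus the rest of the system, assumed
    eur_per_kwh = 0.35     # assumed
    hours_per_month = 24 * 30
    print(system_draw_w / 1000 * hours_per_month * eur_per_kwh)  # ~139 EUR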
> I'm from Germany and our power is around 35 cents per khw.
I don't know if this is a typo or just a slightly different way of writing it in German, but now I'm trying to work out exactly what a "kilohour watt" would be. I think it's just a millionth of a kilowatt hour? Definitely wouldn't want to be paying those prices!
I can't find anywhere genuinely using that unit, although it is mentioned in this 1979 9th grade Electrics curriculum as part of a True-or-False quiz at page 370: https://files.eric.ed.gov/fulltext/ED182438.pdf
Time to file it in my list of "technically correct but extremely annoying units" along with "5Mm" (5000km) and - mainly in the context of batteries - "10Ah" (10000mAh).
I don't think Nvidia (or most companies) should be deciding what to do with the tools they sell. I don't want someone telling me what software I can run with my CPU or GPU any more than I want someone telling me what I am allowed to build with the hammers or screwdrivers I buy.
Fuck miners. They drove the cost of the 1000 series cards up when btc peaked at 14k. Then the 2000 series was the same price as when the btc peaked. Now the 3080 is selling at 2000+ euros.
High chance of changing your mind when these tools get hoarded by people who do not build anything at all and are just in an arms race with each other. The mining business is basically that.
I think the market is working as expected here, since a higher price is placed on the ability to decentralize finance versus having a few people playing video games at higher quality. If you're interested in machine learning there are resources specifically for that, such as Google's Tensor Processing Units (TPUs).
“higher price is placed on the ability for few people to participate in speculation and money laundering versus having better 3d modelling in realtime for everyone”
People who use tools to make money are going to spend more on them than those who use tools for entertainment. So if a company wants to prevent pros from buying their consumer-grade products, then locking the pro features out of the consumer-grade tools is a great way to do it.
nVidia has been doing this for decades with their Quadro cards. They use the BIOS/drivers to lock the features necessary for CAD out of the GeForce cards.
Been trying to build a PC for my daughter for weeks now, waiting for just a half decent GPU to become available. You can't find the 16*, 20*, or of course the 30* line of Nvidia available anywhere (except overpriced on Ebay), it's nuts.
I'm not in a rush or I'd just buy a prebuilt PC which you can still find with these cards, but I wanted to show my daughter how to build a PC. Guess it will be a few more months...
I am happy with this kind of heavy handed move for once if this is intended to reduce demand for graphics products for their other uses and make them available for people who intend to use them for work and gaming.
Maybe I will finally be able to afford top of the line GPU.
I mean, bitcoin miners can still pay high prices if they want, but finally there will be GPUs that aren't encumbered by the mining craze.
The GPU market at this point is supply limited. There is not much else NVIDIA can do besides trying to cripple their products for mining if they want gamers to have any supply. Since this all depends on their signatures not being circumvented, I suspect it won't last too long.
As for the mining-specific cards, all that ensures is that there will be swaths of silicon that will never be accessible to gamers. That they try to spin this part as pro-gamer is very cynical.
> The company also limited the mining performance of the soon-to-be-launched RTX 3060 cards to roughly 50% of the normal performance
What does that mean? What has NVIDIA actually done on those cards? Also, if this doesn't affect the 3050, 3070, 3080, Ti models, Quadros, and Teslas - does it really matter all that much?
Somewhat related, you can still find reasonably priced pre-built computers with latest-gen graphics cards. If you're in the market for an entirely new computer, this would be the cost-effective way to go. The HP Omen is one example (stock is still there from 3rd parties, but admittedly limited).
Probably not the main reason and may not even be a reason. But this seems like it would work as a clever way not to have the market flooded with older (but still very capable) graphics cards once the value of them in a mining rig stops making sense.
Curious to see how ethereum participants react to this. Will it just become a race to defeat nvidia’s mining-detection logic in order to up the available supply of cards?
(caveat: don’t know enough about contemporary ethereum to know how this works with PoS)
Well at least you could say it potentially helps the environment now that you won't have a bunch of gamers crypto mining, or does that not matter the moment there is a drawback that affects us?
I’d argue it’s worse for the environment. These so-called mining cards can’t be used as regular GPUs since they don’t have any outputs - they are compute only. Once they outlive their usefulness they become e-waste since there would be no sustainable secondary market. At least a gaming GPU could be re-sold or otherwise repurposed.
I'm probably going to be downvoted into oblivion but I'm genuinely curious: Does anybody else think that the "mining is bad for the environment and gamers can't buy the cards" argument is a bit weird? After all, if all these cards weren't used for mining but for gaming, wouldn't the environmental outcome be the same?
I mean, is there any inherent upside to burning tons of energy for games vs burning it for a cryptocurrency? How much electricity do all the gaming PCs and gaming consoles in the world consume vs Argentina? To me, it feels a lot like Nascar or Formula one burning tons of fuel.
I don't think that it is somehow bad to play games, but I wonder why the one (crypto) is always criticized into oblivion for the energy consumption, while the other isn't.
> I mean, is there any inherent upside to burning tons of energy for games vs burning it for a cryptocurrency?
You presume the total energy consumption would be the same, whereas GPUs used for mining will likely run 24/7, while gaming is done by a human being who cannot or would not utilize the GPU nearly as many hours of the day.
> GPUs that are used for mining will be used 24/7 whereas gaming is likely to be performed by a human being who cannot or would not utilize the GPU nearly as many hours of the day.
Yup. And even if they could utilize them 24/7, gamers generally* won't have 10+ cards going at once.
*:I'm sure someone has some weird 360 degree flight simulator edge case.
If you reduce this argument, the best way to help our planet is to have fewer children.
I’d argue the target total fertility rate per woman should be somewhere below one. To hell with economic growth that relies on an ever increasing human population.
I'd much rather change my extremely wasteful (can't overstate how much, from someone that makes an effort to minimize it) lifestyle, and much rather use the force of law so that everyone has to, before advocating or forcing such an asinine policy.
Even if strange when it comes to animals, humanity progresses following rules similar to classic evolution, aka by randomly mixing stuff until something works. We've gotten (only) pretty good at selecting and amplifying what works extremely well, but we still need a steady stream of randomly arranged characteristics to enter the pool. Imagine if the next Newton isn't born (or is born 200 years later) because a couple decided to not have children to "save the planet". Perhaps this figure would've been a key piece in a breakthrough discovery about energy, climate, terraforming, public policy....
I think it very much depends on how one experiences life. The two ends of that scale are largely incomprehensible to each other.
Your point about "the next Newton" is unrelated, and IMO misses the mark. It's not coincidence that Newton, Hooke, Boyle etc appeared in the same place at the same time, and it's not because there'd been a crippling shortage of randomly arranged characteristics before that. The right characteristics aren't enough, you also need the leisure to develop them (which implies material surplus) and a society that makes sufficient use of scientific discoveries to value and propagate them. Nobody would have heard of Isaac Newton if he'd been born a subsistence farmer. I'm sure lots of potential Isaac Newtons were, and in many ways that's a tragedy.
> The two ends of that scale are largely incomprehensible to each other.
Absolutely. Just to set the record straight, I am not anti-birth and I think I agree with GP more than I disagree. We must do everything we can to reduce the carbon footprint per person. I am just logically following what my parent post said.
I didn't even know the word anti-natalist. I want all children who are born to be healthy and reach their full potential as productive adults. One child is a blessing. Two is also fine (I have a sibling).
At least in a developed country, if someone has eight or more children (not born at the same time), they are terrorizing the environment in my eyes. I don't see how you can justify that where there is access to decent healthcare and the infant mortality rate is under ten per thousand.
I mean if you live in a place where infant mortality rate is over a hundred (a quick web search shows that IMR in Afghanistan is over 110 which means a hundred and ten die before the age of one of every thousand infants born), I can't imagine the pain and suffering the parents must be going through.
We clearly can do better. The open question is how.
Your POV comes across as much more sensible in this answer, to me. It's become clear that families tend to have fewer children as a civilization progresses. People in some underdeveloped countries have a large number of children because child mortality is so high and there's a dire need for manual labor. So one way to do better is to work towards improving the situation in underdeveloped countries. In developed countries you very rarely see mothers of 8, and a target fertility rate of "below 1", as you stated, is nonsense.
You'll have to pardon my bluntness, I've seen many self proclaimed environmentalists straight up advocate and encourage women to never have children for the sake of the planet.
> I've seen many self proclaimed environmentalists straight up advocate and encourage women to never have children for the sake of the planet
OK, I think you're talking here about VHEMT [1] and the like, which is only very tangentially related to traditional [2] antinatalism. The latter is more a reduction-of-suffering philosophy, in much the same way that vegans/vegetarians/etc believe that it's better for an animal never to live at all than to endure the conditions of modern meat farming.
Obviously neither movement stands much chance of success since neither stands any chance of convincing an entire population. If they have any effect at all, the former will tend to eliminate genes associated with environmentalism, while the latter will tend to eliminate genes associated with unhappiness (which doesn't sound so bad).
This should be weighed against these mining cards becoming e-waste instead of entering the second-hand market. The energy used to manufacture a GPU is a significant part of its overall impact.
I don’t know anybody who will willingly buy a second hand video card used for mining cryptocurrency. They’ve been pushed so hard, for so long, that their useful lifespan is pretty much used up. It’s a big part of the reason they’re being sold and not continued to be used.
Plenty of second-hand mining cards changed hands from miners to gamers two years ago, and in most cases the price was very competitive.
Depending on the card some would have been run underclocked and even undervolted. I personally know friends who purchased mining cards that are still up and running.
For miners, the cards aren't being sold because they've stopped being useful, but because the miners don't have the cash on hand to continue the scale of operation once mining is no longer profitable. It's not that difficult to run the numbers yourself to verify this.
> Does anybody else think that the "mining is bad for the environment and gamers can't buy the cards" argument is a bit weird? After all, if all these cards weren't used for mining but for gaming, wouldn't the environmental outcome be the same?
I never had that specific opinion, but I have another point of view.
Games bring inherent value to the table. That is, if you didn't play games you would be reading a book, going outside to play, browsing the web... Games fill a purpose and bring value. Whether or not I believe our current consumption of videogames is acceptable is irrelevant.
I don't see tangible value in crypto. Crypto is worth what other people are willing to pay for it. If you wanted to invest in an asset that does not depend on the currency you could buy gold.
(I had a paragraph here making a point about international transaction costs. I removed it because I started to find it offtopic. Long story short: I find the societal cost of crypto way too high).
> I don't see tangible value in crypto. Crypto is worth what other people are willing to pay for it. If you wanted to invest in an asset that does not depend on the currency you could buy gold.
Try convincing any crypto "enthusiast" of that, though. Literally none of them have ever been able to explain to me why cryptocurrencies should have value, but that doesn't stop them from proclaiming it the best thing since sliced bread.
It comes down to opinion but I'd argue there is inherent value in a trustless medium of exchange, a trustless explicit protocol for lending, market making (exchange) or gambling or the platform that powers them.
I'd also agree that price bubbles happen and future speculation becomes a dominating factor in their price action, but this doesn't mean they don't have inherent value.
It isn't trustless. If it's finance, it's trust based at the very core. Somebody is trusting somebody else to recognize the value up for offer, and being capable of converting it to a utilizable form.
This is what crypto-enthusiasts must be blind to. We could be using bottlecaps to do the same bloody thing. Nobody wants to though. Why? Because nobody else takes the value assertion of a bottle cap seriously. As long as people keep buying into the hype, and the enthusiasm is kept up through selective refusal to accept the realities and externalities of the process, then the gravy train continues.
It's not about trusting if the bottle caps are worth anything or are useful, but if the supply and exchange of bottle caps follows the financial rules that are dictated.
That's because the value and effectiveness of cryptocurrencies as an un- or poorly-regulated store of value depends on there not being any doubt as to their value.
As with most things in finance, it's all trust at the bottom of it. If there is any doubt, no one will be around to be left with the bag.
I used to be one... there is no real argument. It's just mania driven by the price going up and hoping for the price to keep going up. Any argument against BTC stands between them and the moon, so they respond with anger. It's like on Reddit when GameStop stock was at 400: trying to tell people to sell was met with anger.
> I don't see tangible value in crypto. Crypto is worth what other people are willing to pay for it.
That's pretty much true for anything that has financial value, though. Setting aside the chemical properties of gold (conductivity, resistance to corrosion, etc.) that make it more valuable than some other metals, pretty much every financial construct in human culture is only valuable because we decided it was valuable. That's not to say everything is inherently valueless, but crypto is not too dissimilar to, say, digital coins for a game; you just can't usually swap it back in the latter case so it has little/no tangible value. Similarly, the value of stocks, foreign currency, even money itself fluctuates daily. As long as you can swap it back to "real" money, I see it as having tangible value.
That being said, I personally choose not to get involved in any cryptocurrency because I would rather invest in more "established" forms of value, like stocks and bonds and whatnot.
> That's pretty much true for anything that has financial value, though.
Correct. Maybe a better statement would be "I don't see that the inherent value of crypto is bigger than that of the dollar or gold". This gets even more complicated when you take into account the cost of the transactions that I have discussed with another user.
Although you may disagree with the current value of most cryptocurrencies, would you at least agree that their inherent value isn't 0?
Newer projects like Uniswap, AAVE and MakerDAO (for DAI) are genuinely novel concepts that can't be accomplished trustlessly without blockchain technology.
Bitcoin has a lot of intangibles (name recognition, trust, hashpower) and for many is a more interesting asset (in what it represents) compared to gold.
> Although you may disagree with the current value of most cryptocurrencies
I would like to clarify something. I didn't say I disagreed with the value of cryptocurrencies (that is, its value in dollars). I said I find their societal cost way too high. In other terms, (Inherent Value of crypto < Societal Costs of crypto).
> would you at least agree that their inherent value isn't 0?
I agree with that. Although my previous answer stated "I don't see tangible value in crypto", I removed the paragraph that said crypto may have some inherent value when it comes to international transactions (or transactions in general). If international transactions in crypto are cheaper than with banks, clearly those who want to make these transactions do see some inherent value in the currency. That is, its value would emerge from its transaction properties.
I removed the paragraph because I just don't know enough about the fees of such transactions. I think it is reasonable to expect banks and financial entities to offer transaction fees lower than crypto for all use cases. If that happened, then I just don't see any value in crypto whatsoever.
I have said it a few times on HN: I see more potential into a "Crypto Dollar" than in crypto in general.
Crypto is an investment the same way $GME or playing roulette is an investment. A better term would be speculating or gambling. Nothing wrong with either, but let's call it what it is.
Comparing it to roulette is hyperbole. Would you say that penny stocks are roulette? You may argue they have similar odds, but there are fundamental differences between them.
There's still a difference between buying $GME and playing roulette. Volatile as it may be, the chance of losing all of your investment is much lower with meme stocks.
Are you saying there are people out there that are happier investing in "crypto" than in "stock X" or "gold" or "bonds"? It just makes them happier that that particular name appears on the screen?
I don't doubt that is the case, but that kind of reinforces my point. The societal cost of crypto is too high. If we have to justify the electrical consumption of crypto because people enjoy it more when they see its name on their screen, we are clearly doing something wrong.
But diamonds do have some pretty awesome mechanical properties [0]. They do have intrinsic value. Sure, the intrinsic value of diamonds < value of diamonds, but at least when you mine diamonds a percentage of it will go into use.
We can do some quick estimates to compare (using a 3090 because I helped a friend do a bit of mining on theirs and I know what figures are plausible):
First some variables:
RTX3090 idle power usage = 21 Watts
RTX3090 mining power usage = 270 Watts
RTX3090 gaming power usage = 350 Watts
Mining energy usage:
270W * 24H = 6.48 kWh per day
Gaming energy usage:
(low, 1 hour per day) 350W * 1H + 21W * 23H = 0.83 kWh per day
(med, 4 hours per day) 350W * 4H + 21W * 20H = 1.82 kWh per day
(high, 8 hours per day) 350W * 8H + 21W * 16H = 3.14 kWh per day
(ext, 18 hours per day) 350W * 18H + 21W * 6H = 6.43 kWh per day
So even someone I'd consider a high-volume gamer (8 hours per day) uses less than half the energy of running a miner for a day. You'd have to game for over 18 hours per day to use more energy than a miner. Of course both the gamer and the miner end up with a chunk of metal and plastic that will end up in a landfill, so that bit counts the same.
Note: From what general figures I've seen online the energy works out roughly the same for the rest of the recent RTX series cards (some such as the 3060Ti work out a few % more favourable for mining as they are more energy efficient overall)
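If it helps, the arithmetic above collapses into a few lines; the wattages are the same assumptions as in the comment, so the output just reproduces the table:

    # Daily energy for an RTX 3090 under the usage patterns above (assumed wattages).
    IDLE_W, MINING_W, GAMING_W = 21, 270, 350

    def daily_kwh(active_w, active_hours):
        # Active draw for the given hours, idle draw for the rest of the day.
        return (active_w * active_hours + IDLE_W * (24 - active_hours)) / 1000

    print(f"mining 24/7   : {daily_kwh(MINING_W, 24):.2f} kWh/day")
    for hours in (1, 4, 8, 18):
        print(f"gaming {hours:>2}h/day: {daily_kwh(GAMING_W, hours):.2f} kWh/day")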
Why not both? I’m paying for my card by mining when I’m not gaming. I checked with a watt meter and I’m using 270-300 watts mining total (cpu, gpu, etc). It also heats my room so it’s more efficient than running a heater or using a space heater (1500 watts).
Good comparison though it misses some externalities.
Gamers also use monitors (~40W) for many hours a day, and run the whole machine for one card, while many miners minimize all other costs but the GPUs. Also, gamers typically wear their cards out a bit faster and replace them with newer ones sooner on average (miners keep mining with very old cards).
Sure, ultimately gaming is entirely optional, but alas it's no different from other pastimes in that regard. Games make a lot of people very happy, though, and provide employment to millions (?) of people, so personally I'd put it higher in the needs hierarchy than crypto-currencies. Not being able to buy the latest graphics card due to high demand surely is a "first-world problem", but for NVIDIA it might become a problem if the situation persists, as they risk angering a lot of loyal customers.
Also, gamers and professional users will likely still buy graphics cards in 5-10 years (and more so), while I'm pretty sure miners will have either given up or switched to more custom solutions like ASICs by then (which has already happened with Bitcoin). So NVIDIA also risks losing a long-term market to please a potential short-lived market, hence I think they do the right thing by trying to disentangle these two market segments for good.
> Is this really good news, or is this just Nvidia playing both sides? To be clear, these CMP cards are still the same exact silicon that goes into GeForce and Quadro graphics cards. They don't have video outputs, cooling should be improved (for large-scale data center mining operations), and they're better tuned for efficiency. But every single GPU sold as a CMP card means one less GPU sold as a graphics card. What's perhaps worse is that while miners can still use consumer cards for mining (maybe not the upcoming RTX 3060, depending on how well Nvidia's throttling works), gamers can't use these mining cards for gaming.
> Nvidia does state that these GPUs "don't meet the specifications required of a GeForce GPU and, thus, don't impact the availability of GeForce GPUs to gamers." Frankly, that doesn't mean much. What does Nvidia do with a GPU that normally can't be sold as an RTX 3090? They bin it as a 3080, and GA102 chips that can't meet the 3080 requirements can end up in a future 3070 (or maybe a 3070 Ti). The same goes for the rest of the line. Make no mistake: These are GPUs that could have gone into a graphics card. Maybe not a reference 3060 Ti, 3070, 3080, or 3090, but we've seen TU104 chips in RTX 2060 cards, so anything is possible.
Nvidia isn't risking anything.
They boost their sales now by selling mining cards, knowing those cards won't enter the second-hand market to compete with their new products.
They gain good PR with their mining-locked 3060 cards. Gamers praise them without internalising that a number of these chips would otherwise have been binned lower and sold as graphics cards; instead, an artificial restriction lets Nvidia allocate silicon behind the scenes.
They're actively not winning good PR, because gamers realize they're being screwed despite Nvidia's rhetoric, and miners can now no longer recoup some money from old cards they no longer need by reselling to gamers. And everyone realizes how shit this is for the environment.
You can say Nvidia is making a smart business decision (questionable), but to say this is anything other than a PR disaster is, IMO, incorrect. The people who were the target of their press releases saw right through it.
The initial sentiment in Hacker News and Reddit threads was overwhelmingly positive for Nvidia and against miners.
Now that some time has passed and Linus (from LTT) has released his video, I'd say the sentiment is mixed. I'm still seeing many tweets praising Nvidia for finally taking a step for gamers.
It's true that cheap cards were available when bitcoin crashed in 2018. I'm not arguing that cheap cards weren't available. I'm saying the feeling that cheap cards are a huge positive for the gaming community is missing the forest for the trees, and forgetting the pain of what happened before there were cheap cards. Cheap cards came around eventually long after the cards had been unavailable and waaaaay overpriced. It was a relief for gamers that the bubble popped, but on balance, it was not a good thing that they had the bubble.
I would guess that computers used for gaming are turned off most of the time, whereas mining rigs are likely turned on and using much of the GPU at all times?
How much use does a GPU get in a gaming PC? A few hours per day? In crypto farms these things are running 24/7. Anyway, I still think it's a bad move to tell people what they can and cannot do with their hardware.
Gaming gives people enjoyment. While they're gaming they aren't consuming other energy sources, and to a certain degree I think this is okay.
It also supports advances in science by making this hardware more affordable for everyone else.
You can still criticize it, and there have been huge energy savings for idle GPUs, which was not the case 10 or so years ago.
Nonetheless, one thing doesn't make the other better or worse. Using Bitcoin supports a system which requires mining to make transactions. It also doesn't benefit me, only potentially the people who profit from it directly, e.g. by laundering money or moving their millions from left to right. And while doing that, it takes the graphics card I wanted to play with and leaves it sitting somewhere, running 24/7, for what?
What real problem does Bitcoin solve for me and most other people in the world?
>They are not consuming other energy sources and to a certain degree, i think this is okay.
Both are using energy; I don't know what the 'other' even refers to.
>It also combines and supports advantages in science by making this more affordable for the rest.
Both of those apply to crypto as well. Better GPUs are developed because of demand; gaming demand doesn't necessarily contribute more than crypto demand, and cryptocurrencies are putting a lot of money into cryptography.
I might be a weirdo, but playing RDR2 for 85 hours was very entertaining, and while I played RDR2 I did not travel or do anything else resource-intensive.
I have not had that kind of return on investment when doing something with Bitcoin.
It was more like "huh, how much is the transaction fee now?", "ah, why has this market been seized?", "oh, I found a little bit of Bitcoin here, I totally forgot."
And while I played those 85 hours over a span of 4 weeks or so, a mining rig would already have been drawing power for about 590 hours more than I was.
I said "to a certain degree, I think this is okay", and I also think that for Bitcoin this is not the case.
> is there any inherent upside to burning tons of energy for games vs burning it for a cryptocurrency?
Not an upside, but probably that the mining is mostly done by one person (or org) using a lot of energy. A gaming rig probably won't be running 24x7 at max utilization.
Other people in this thread bring up great counterpoints. Namely, gaming is not nearly as intensive as mining and whatever value gaming brings to society is more than that of mining.
I'd also say that gaming at least occupies the human who is playing. Mining would be running in the background: the human would still be consuming energy doing something else. So one doesn't replace the other. Therefore using the inefficiency of one to justify the inefficiency of the other is not a valid argument.
One last observation: you're assuming that all these video cards would still have been produced. Maybe production of hardware has gone up to serve the mining demand.
I use my card for mining just during winter and count the energy for mining under "heating my apartment". So the coins are just a bonus from heating. In the future I couldn't do that with an Nvidia card.
There are plenty of good answers addressing the reality that gaming doesn't use anywhere near the same aggregate energy as mining, but to me, the more important difference is that currency of some kind is critical infrastructure. Civilization collapses without it. Right now, civilization won't collapse without cryptocurrency because we can just continue using government-issued currency, but in the crypto endgame where it is actually supposed to replace government-issued legal tender, we've suddenly become reliant on a tremendously inefficient power grid hog. Imagine something like what just happened in Texas happens on a larger scale. Well, now not only do you not have lights at night, but you can't spend money either.
The basic fabric of society is much less reliant on the ability of citizens to play digital games, so losing that ability wouldn't matter much.
Running with the Nascar/Formula One analogy, those things aren't such a huge deal because they're niche applications of vehicle technology. It would be far more disastrous for society at large, on the other hand, if we decided to make all commuter vehicles get 2 MPG and require high octane fuel and new tires every two hours.
Similarly, cryptocurrency is (relatively) harmless right now because it is a niche speculative commodity. It would be a global disaster if it ever became widely used as actual currency. Cryptocurrencies are like castles, gaudy but interesting spectacles when only a few lords build them, but the world would quickly run out of rocks if we made them the universal unit of housing.
I would argue that gaming creates a lot more value (joy) per tflophour(?) compared to mining. How many thousands of hours of gaming equals one btc transaction?
How do you compare the value of gaming to trustless decentralised gambling or prediction markets? What about to NFT based trading cards? How about decentralised exchanges?
Not even the most dedicated gamer is going to burn the amount of electricity a miner does with the same hardware. That's not even considering the number of cards miners purchase...
The number of games that actually cause a modern GPU to hit its maximum TDP is also much lower than you'd think. If vsync is turned on, even a 3060 will trivially hit 60fps on older/simpler games, and on modern ones unless the settings are turned way up. Games like Doom Eternal and Counter-Strike: GO are incredibly well optimized and can easily hit hundreds of frames per second on modern hardware. The same goes for simple 2D games or more balanced titles that lean on the CPU (unless you're using complex shader packs, popular games like Minecraft do most of their work on the CPU).
> After all, if all these cards weren't used for mining but for gaming, wouldn't the environmental outcome be the same?
No. Really, no.
Mining hardware is just trash when new hardware is released, pretty much immediately.
OTOH, gaming hardware very often has a second life in the form of used hardware and lower-specced gaming computers. Don't be fooled by YouTube influencers; most real people aren't buying the latest and greatest hardware and dropping their previous cards. IIRC, Valve's stats showed that 1080-era cards are still very common and in use.
And by the way: pretty much no one is running games 24/7. Mining hardware, OTOH, is specifically meant to be kept going 24/7.
> After all, if all these cards weren't used for mining but for gaming, wouldn't the environmental outcome be the same?
I would assume miners keep these cards pegged at 100% 24/7. By comparison very few games actually push graphics cards to 100% utilization, especially when you're talking about top of the line cards like the nVidia 3000 series, and on top of that very few people play games for more than a couple hours per day. A card in the hands of a gamer probably consumes only a small fraction of the power that same card would in the hands of miners.
To play one game for one day with your GPU - and I'm just guessing here - probably uses a small fraction of the power that mining 0.0001% of a bitcoin does, because blockchain is not the same thing as graphics.
Besides, it's also the transacting that uses a lot of energy, not just the mining.
I think it's a pretty lazy and complacent argument, given that energy-constrained industries are the ones driving the adoption of renewable energy. The price of coal power is much, much higher than solar, wind, hydro, or geothermal.
Because when the crypto card is done it goes in the trash, whereas when a gaming card is done it gets resold. People upgrading sell their old cards on FB Marketplace etc.
A lot of people are claiming this as a win for the non-mining consumer, but as the article puts it:
> Nvidia does state that these GPUs "don't meet the specifications required of a GeForce GPU and, thus, don't impact the availability of GeForce GPUs to gamers." Frankly, that doesn't mean much. What does Nvidia do with a GPU that normally can't be sold as an RTX 3090? They bin it as a 3080, and GA102 chips that can't meet the 3080 requirements can end up in a future 3070 (or maybe a 3070 Ti). The same goes for the rest of the line. Make no mistake: These are GPUs that could have gone into a graphics card. Maybe not a reference 3060 Ti, 3070, 3080, or 3090, but we've seen TU104 chips in RTX 2060 cards, so anything is possible.
There's also seemingly little value for miners:
> Note that the 90HX lists an Ethereum hash rate of just 86MH/s and a 320W TGP. After a bit of tuning, an RTX 3080 can usually do 94MH/s at 250W or less, so these cards (at least out of the box) aren't any better.
> It gets worse as you go down the line, though. 50HX only does 45MH/s at 250W — that basically matches the tuned performance of the RTX 2060 Super through RTX 2080 Super, with a TGP that's still twice as high as what we measured. It's also half the speed of an RTX 3080 while potentially still using the same GPU (10GB VRAM). Or maybe it's a TU102 that couldn't work with 11 memory channels, so it's been binned with 10 channels. Either way, who's going to want this? 40HX at 36MH/s and 185W and 30HX at 26MH/s and 125W are equally questionable options.
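Putting the quoted numbers side by side as hashes per watt makes the comparison clearer (all figures taken from the article excerpts above, nothing independently measured):

    # Mining efficiency implied by the quoted figures (MH/s per watt).
    cards = {
        "CMP 90HX":         (86, 320),
        "RTX 3080 (tuned)": (94, 250),
        "CMP 50HX":         (45, 250),
        "CMP 40HX":         (36, 185),
        "CMP 30HX":         (26, 125),
    }
    for name, (mhs, watts) in cards.items():
        print(f"{name:<18} {mhs / watts:.2f} MH/s per W")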
The restrictions placed on the 3060 have also been confirmed to be more than just drivers:
> According to Bryan Del Rizzo, director of global PR for GeForce, more things are working behind the driver.
> According to Mr. Del Rizzo: "It's not just a driver thing. There is a secure handshake between the driver, the RTX 3060 silicon, and the BIOS (firmware) that prevents removal of the hash rate limiter." This means that essentially, NVIDIA can find any way to cripple the mining hash rate even if you didn't update your driver version. At the same time, according to Kopite7Kimi, we are possibly expecting to see NVIDIA relaunch its existing SKUs under a different ID, which would feature a built-in anti-crypto mining algorithm. What the company does remains to be seen.
> > These are GPUs that could have gone into a graphics card.
That's an interesting take. After the discussion of the announcement earlier this week, I presumed that this was a way for NVIDIA to improve yields by rescuing bad chips from the scrap heap. If a failed 3090/3080/3070 is going to a CMP instead of a 3060, that's not the win-win that they're pitching. If the alternative is a 2060, on the other hand, I'd still give them the benefit of the doubt.
They improve yields by rescuing chips that fail to meet 3090 specification and selling them as a 3080, 3070, or whatever the highest bin is that they do meet. They wouldn't become a 2060, as that uses a different architecture.
The manufacturing processes will also improve over time, and at some point demand will dictate binning over yield: chips that are capable of better performance get restricted and sold as lower-end cards. If yield is no longer an issue, that is simply more profitable.
Back in the day it wasn't uncommon for enthusiasts to take lower-end cards and BIOS-mod them for equivalent performance, but now features are often disabled in silicon.
If they can simply change the key needed to sign the BIOS (or whatever) for new cards coming off their production line, they can continue to support cards that were sold before the change and have the limiter on cards made after the change. They could also make a tiny change to the model number. I don't see how they could possibly be sued for doing so.
Nvidia makes the same money regardless of whether a miner or a gamer buys their existing cards. They won't be losing any money if 100% of their production goes to gamers. As a gamer, if this reduces miners' interest in the products I want, I'd say it's good for me.
They can, but I doubt they will, as I, like the quoted article, believe this would lead to them being sued over misleading marketing, just like the 970 3.5GB controversy.
Maybe the model number would do it, but I'd imagine it would have to be pretty distinguishing.
It's a bios/firmware level limitation, and the latest GPUs all have firmware signing.
Not unbeatable, but it would require a much more involved hack than just modifying some drivers. Nvidia have painted a huge target on it, so we will see what happens.
Does that mean they're bringing down the xx60 card price too?
Rabid gamers who MUST have that 1440Hz refresh may be willing to pay anything for their space heaters, but there are people who play games but aren't willing to spend 1K (or 3K, how far have they gone these days?) on a video card.
In other words, is this done to increase sales volume or just nvidia's profit?
> is this done to increase sales volume or just nvidia’s profit?
What if it’s neither? Nvidia tried increasing production of 1070s and 1080s three years ago to meet bitcoin demand, and it bit them hard when the bitcoin bubble suddenly popped. They got stuck with an oversupply of cards right when Turing was launching, it ate into sales and the stock dropped in half overnight. Meanwhile their customer base of gamers were pissed because they couldn’t get gaming cards. What if they’re just hedging against another bubble popping, and trying to avoid getting killed by it, again?
The article clearly states that the hash rate will be limited if the Ethereum mining algorithm is detected by the driver, not otherwise. Gaming use cases will remain the same.
It's done so that when the crypto bubble bursts again, the market isn't flooded with used graphics cards.
In 2017 everyone was buying GTX 10XX cards, on one hand because they actually represented a good improvement over the last generation for a change, but mostly due to the crypto bubble. Which then burst at the beginning of 2018.
Then RTX 20XX came along, which actually represented a slight drop in price/performance if you don't count the raytracing/AI cores (which to this day hardly matter in gaming), while used GTX 10XX cards were still everywhere.
So now nvidia wants to make sure cards used for crypto go straight to the landfill after the current bubble bursts.
And at the same time, it lets them cut out the scalpers and sell straight to the miners who'll pay twice the price for a card - without the price-gouging upsetting gamers.
It's because you need to load firmware on the GPU to make it run fast, and that firmware is illegal to redistribute and impossible to reimplement since it's signed. There is nothing the nouveau project can do short of finding an exploit in the card's crypto.
Unless they decide to add more algorithms after the fact, I'd be very surprised and outraged if this detected anything other than Dagger-Hashimoto via its memory access pattern.
This is why we need open source drivers. Now they are crippling cards for crypto mining, in the future there might be a whole list of other software features that require a paid unlock or even monthly subscription to use the full potential of your own hardware.
There are already a bunch of other cases like this. For example, the Windows GeForce drivers have been for years (if not decades) loaded with a bunch of shady OpenGL heuristics to punish you if you do stuff that looks like enterprise/industrial rendering (AutoCAD, etc).
For an ordinary game developer this means cheap operations like texture readback will go from taking 1ms to 16ms (bumping your framerate down). The same stuff is consistently fast in Direct3D because enterprise/industrial workloads don't use it.
NVENC for hardware video encoding is also artificially limited on consumer cards, which forces you to buy quadros for scenarios where you want to do multi streaming.
> According to Bryan Del Rizzo, director of global PR for GeForce, more things are working behind the driver.
> According to Mr. Del Rizzo: "It's not just a driver thing. There is a secure handshake between the driver, the RTX 3060 silicon, and the BIOS (firmware) that prevents removal of the hash rate limiter."
It's a little weird because they released the 3060 Ti before the regular 3060, so when they say they're launching the 3060... it's just confusing. I feel like they should have given it its own identifier to help with that confusion, or launched them in the normal order: base model, then the upgraded one.
I am convinced Nvidia is deeply involved in the crypto space more so than they care to admit. I can’t wait until Apple matches or surpasses Nvidia so we don’t have to listen to Jensen’s lies quarter after quarter about how insignificant crypto mining is to their business.
Nvidia isn't using TSMC for any of their current products, they're using Samsung. They may switch to TSMC for their more high-end cards, but I don't think that is the case right now.
Right, so what’s their volume on a100s and are they a favored mining tool? If so, I think that might be the disconnect between what people think nvidia is focused on, and what they are focused on.
Plus, I'll point out that a lot of information can be hidden by public companies. Amazon was pretty quiet about its growth in AWS sales as a dominant force for a number of years.
> Right, so what’s their volume on a100s and are they a favored mining tool?
I don't think there are numbers available for A100. NVIDIA discloses in Earnings how much it makes from each sector: gaming ~50% and data center (A100) ~40% or so. The data-center GPUs are way more expensive, but one can probably estimate the numbers from there.
I'm not sure why would that be relevant, but if someone wants to get some factual information, that's probably the only way to go.
Regarding CMP chips, NVIDIA says that CMP chips are GeForce chips that don't pass QA for gaming, and that GeForce chips could be used for CMP but are artificially crippled to prevent cryptominers from using them, so that they don't take GeForce cards out of gamers' hands (which makes sense, since gamers are a long-term customer base and cryptominers are usually a ~<1 year trend).
A lot of people are saying that they don't believe this information (e.g. LTT), but AFAIK nobody has provided any facts yet.
I guess we'll have to wait and see how easy / hard it is to "unlock" GeForce products for cryptomining, and for some independent testing of how good the CMP products are for gaming.
The last time this happened, LTT managed to retrofit CMP products for gaming, so their skepticism isn't unfounded.
But TBH the main argument of LTT analysis is that NVIDIA cares about its shareholders, and is only doing this differentiation to increase their profits.
I am having a hard time following the argument that suggests that this behavior is "outrageous". That's what you would expect of any publicly traded company. That's also what all NVIDIA shareholders would expect.
Yeah, this is pretty clearly price differentiation. nVidia was trying to sell the same product to gamers and to miners, but miners can afford much higher prices. So they create an expensive mining card, and then try to stop miners from using the cheap graphics cards.
They've already done it for AI. This is just doing the same thing for crypto.
This is just par for the course on closed down hardware. The manufacturer decides how you use it, that's how it has always been. I'm actually happy about this news, I hope it raises awareness for the need of open hardware, it's not like I'd ever buy one of these cards when competitors at least have open source drivers. Lack of hardware freedom generating electronic waste isn't unheard of either.
I’m guessing they are going to try and raise the prices and create a new market segment. Just as they have separated datacenter GPUs, increased the price massively (coming up to a factor of 10x) and then tried to use a stick to prevent the use of desktop GPUs for datacenter work (eg. the licensing change).
Unfortunately (for their bottom line) it probably won’t work, if the restriction is in the driver then someone like myself (or many other people in HN) will have it patched in a number of hours. If it’s in the firmware it’ll take longer, but it’ll be done. Unless they’ve actually fused out integer units on the consumer cards, that would be bold.
> According to Bryan Del Rizzo, director of global PR for GeForce, more things are working behind the driver.
> According to Mr. Del Rizzo: "It's not just a driver thing. There is a secure handshake between the driver, the RTX 3060 silicon, and the BIOS (firmware) that prevents removal of the hash rate limiter."
I think it's important to look at it the other way around, as the article does: the CMP lineup will be excellent for their bottom line, since miners, without competition from gamers and others, can buy up cards that won't contribute to the second-hand market when the mining boom dies down.
As suggested in the article, silicon that would have been allocated to mid- and low-end cards is now allocated to mining-only cards that have little resale value and won't hurt Nvidia's bottom line when they release the 4XXX series.
I can imagine that nowadays you can protect your hardware with a crypto chip that unlocks firmware with public/private keys. To break it you would have to remove the chip.
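That's roughly how signed firmware works in general. A toy sketch of the idea using the Python cryptography package, purely to illustrate the concept (it says nothing about how Nvidia actually implements its driver/BIOS handshake):

    # Toy signed-firmware example: only images signed by the vendor's key will verify.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    vendor_key = Ed25519PrivateKey.generate()   # stays with the vendor
    public_key = vendor_key.public_key()        # baked into the card / boot ROM

    firmware = b"...full-speed firmware image..."
    signature = vendor_key.sign(firmware)

    public_key.verify(signature, firmware)      # genuine image: passes silently
    try:
        public_key.verify(signature, firmware + b"patched")  # tampered image
    except InvalidSignature:
        print("refusing to load modified firmware")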