The Radeon R9 280X is a nice card for its MSRP of $300, at least assuming that you don't mind the relatively poor energy efficiency. New Egg has some in stock. The cheapest is $400.
Wait, what? $100 over MSRP is the cheapest, and prices go up from there? The card is effectively a rebranded Radeon HD 7970, which is just shy of two years old. And it's a card that AMD is still actively manufacturing and selling. The Radeon R9 290 and 290X are also far above MSRP, and that's even for the reference cards that you definitely don't want for gaming purposes. So what is going on here?
The answer is Bitcoin, Litecoin, and other cryptocurrencies. Modern Radeon cards have some sort of bit-shifting instruction that modern GeForce cards lack. For games, this basically doesn't matter; GPU code in games deals almost exclusively with floating point data, and bit-shifting floating point data will do very strange things. Bit-shifting is useful as a high-performance way to multiply or divide integers by powers of two, so CPUs get some modest benefit out of it in a lot of programs, and it's also trivial to implement in hardware. (Think of it as multiplying by powers of 10 when doing arithmetic by hand: tacking some zeroes onto the end is much faster than going through 0 x 3 = 0 a bunch of times.)
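To make that concrete, here's a quick Python illustration of why shifts are such a cheap way to multiply or divide by powers of two (the numbers are just made up for the demo):

```python
# Shifting left by k multiplies by 2**k; shifting right by k divides by 2**k
# (rounding down). A shift is essentially just rewiring bits, which is why
# it's so cheap to build in hardware.
x = 1375

print(x << 3, x * 8)    # 11000 11000
print(x >> 2, x // 4)   # 343 343

# The "tack zeroes onto the end" analogy, but in binary:
print(bin(x))           # 0b10101011111
print(bin(x << 3))      # 0b10101011111000
```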
But the hash functions used by Bitcoin, Litecoin, and other such cryptocurrencies involve a ton of bit-shifting. That Radeon cards are massively faster at it than GeForce cards means that they're also massively better at mining such currencies. Efficient bitcoin mining has largely moved to FPGAs and custom ASICs, but that hasn't yet happened for Litecoin and some other cryptocurrencies, leaving high end Radeon cards as the most efficient way to mine them.
So what happens if you can buy a $300 video card and expect to make $20 per day doing Litecoin mining with it? You buy as many as you can, and they sell out at $300. When vendors like New Egg realize that they sell out immediately at $300, they raise prices until the cards stop selling out immediately.
So why did this only happen recently? Because prices on Bitcoins and other such cryptocurrencies have spiked upwards. Bitcoins first broke $250 in November, and went over $1200 earlier this month. Other, newer digital currencies have seen similar spikes. That makes buying up high end Radeon cards and using them for mining currencies highly profitable, at least if you believe that prices will stay that high.
Which they won't, of course. It's hard to imagine a future in which dozens of currencies are used interchangeably even in a very small geographical area. I expect cryptocurrencies never to expand beyond a tiny niche, and likely to disappear entirely. But even if they're here to stay, the world would most likely converge on one, rather than having people mine dozens of them, some of which may not be as secure as people think.
The reason prices stay so high is that if you believe every card you buy is going to be highly profitable, you buy a lot of them. Whereas a gamer rarely has much use for more than one card and almost never has much use for more than four, currency mining scales well to arbitrarily many cards. So people will buy dozens or hundreds at a time if they can. If AMD could bring twice as many cards to market, those might all sell out, too. And that's hard now that AMD has to divert a ton of production to Xbox Ones and PS4s.
So I don't know when prices will return to sanity, which we could loosely define as "not much greater than MSRP". At the least, Radeon GPUs aren't anywhere near optimal chips for currency mining. FPGAs would probably be much better, and custom ASICs certainly so. Those will arrive for other currencies, just as they have for bitcoins, and that will dampen interest in using video cards for coin mining. But that could take a while.
Comments
A similar thing happened when the 7970 was first released; people were buying them like crazy for Bitcoins.
Like Quiz said, now you can buy specialized FPGA bitcoin processors that are faster and cheaper (and take less power) - but this time around it's Litecoin.
Yes, it will either crash, or something specialized, more efficient, and cheaper will come along.
I've always been an Nvidia man, since the original GeForce came out so many years ago. The last computer I had was a laptop with ATI graphics, and it's OK. However, ATI has really bad support. When a new game comes out, you have to wait like 6 months for the next driver to optimize for it, while Nvidia seems to optimize the next day or week. I think I'll stick with Nvidia from now on.
For instance, when Assassin's Creed 4 launched recently, some people were complaining about low FPS (10-25 fps) in the game. The VERY next day Nvidia came out with a driver that optimized the game so you would get 60+, while there are still people complaining that the game runs horribly on Radeon cards. Except for the first day, I could run AC4 on my new Nvidia laptop on maximum settings with no problems at all.
Before my ATI laptop, I never paid much attention to how, when you download an Nvidia driver, it says 6% increase in this game, 4% increase in that game, etc. I guess that really makes a huge difference. I had always heard that ATI was bad about releasing drivers, but I guess I know better now.
PS...what's bitcoin XD
If not for that, I'd have thought you hadn't even read the thread.
In 2001, the NSA released SHA-2, a family of hash functions; the variant that matters here is SHA-256. The idea is that you can give it any data input, the algorithm will do some computations, and it will give you a fixed-size output (32 bytes for SHA-256). As a hash function, it was basically designed with the goals that it should be very fast and efficient to compute, but that it should be impossible in practice for anyone to find two different inputs that hash to the same output.
The point of this is as a way to check whether some new data matches some old data. A simple checksum would be able to determine whether data was randomly corrupted, but a secure hash function can detect intentional manipulation, too. For example, suppose that you write and compile a program, and then compute the hash of it and store it. Later, if you suspect that someone may have modified your program, you can compute the hash of the "new" program and see if it matches the old. If the new hash value matches the old, you can be all but certain that the program is unmodified, while only having to store a 32 byte hash value rather than the entire old program.
This can be used to check for malware, for example, or to verify that a message wasn't intercepted and modified by someone malicious. It can also be used to store passwords: if all you have is hashes of passwords and people picked secure passwords (which often doesn't happen), a hacker who steals the database won't be able to recover the original passwords. When someone enters his password, you can hash it to see if it matches your known hash to verify that the password is correct.
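To make both uses concrete, here's a small Python sketch using the standard hashlib module (the program bytes and the password are made up for the example, and real password storage should also add a salt and use a deliberately slow hash, but that's beside the point here):

```python
import hashlib

# --- Checking that data hasn't been modified ---
program = b"\x7fELF...the compiled program's bytes..."   # stand-in for the real file contents
known_hash = hashlib.sha256(program).digest()            # 32 byte digest; store only this

# Later: has it been tampered with?
suspect = b"\x7fELF...the compiled program's bytes..."
if hashlib.sha256(suspect).digest() == known_hash:
    print("hash matches: almost certainly unmodified")
else:
    print("hash differs: the data has been changed")

# --- Checking a password without storing the password ---
stored = hashlib.sha256(b"hunter2").digest()   # only the hash lives in the database

attempt = b"hunter2"
print(hashlib.sha256(attempt).digest() == stored)   # True -> password accepted
```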
In 2009, some Japanese guy (whose identity is disputed) decided that it would be a good idea to make a digital currency based on SHA-2. The idea is that you hash some data with SHA-256, treat the output as an unsigned 256-bit integer, and if it's less than some threshold, you've just mined new bitcoins. The threshold is adjusted as time passes to make it harder to mine new bitcoins, so as to limit the rate at which new bitcoins are generated.
No one is publicly known to have found an algorithmically efficient way to mine bitcoins, as opposed to just trying a zillion different inputs and hoping that occasionally one gives a sufficiently low output to succeed by dumb luck. Because video cards can do massive amounts of computations in parallel, they're much better at mining bitcoins than CPUs that don't have anywhere near the same parallelism.
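Here's a toy Python version of that brute-force loop (the block data and threshold are made up and wildly easier than the real network target, and real Bitcoin mining hashes a specific block-header format with double SHA-256, but the shape of the work is the same):

```python
import hashlib

block_data = b"previous block hash + pending transactions"   # stand-in for a real block header
threshold = 2**240    # toy target: roughly 1 try in 65,536 succeeds; the real target is far lower

nonce = 0
while True:
    # Hash the data plus a counter, and read the 32 byte digest as a 256-bit integer.
    digest = hashlib.sha256(block_data + nonce.to_bytes(8, "little")).digest()
    if int.from_bytes(digest, "big") < threshold:
        break         # success by dumb luck: the hash happened to come out below the threshold
    nonce += 1

print("found a winning nonce after", nonce + 1, "tries")
```

A GPU does exactly this sort of thing, except it runs thousands of those attempts side by side.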
That no one yet has figured out a way to flood the market with bitcoins is good reason to believe that SHA-2 is pretty secure for its intended use. But it's actually not a very good algorithm to back a currency. The problem is its efficiency: in order to compute the hash, you only need a handful of operations and something on the order of tens or hundreds of bytes of cache.
That leaves it very open to making a custom chip that computes it vastly more efficiently than any GPU. The custom chip can simply not implement any operations that aren't needed and have the exact amount of cache needed, and then do that an enormous number of times in parallel.
For the purposes for which SHA-2 was originally intended, that's not a problem: if attacking a hash would otherwise have taken 1 quintillion times as long as we have before civilization collapses and your custom chip can do it in only 100 quadrillion times as long, so what? But if a new chip can mine bitcoins ten times as fast as anyone else, that's undesirable.
Efficiency gains from making a custom chip are going to be a problem with any cryptocurrency. For example, you're not going to use the video decode block of a GPU, so that's just wasted silicon. Encryption-related stuff tends to use integer and bitwise computations almost exclusively, while GPUs are very heavily optimized for floating-point computations because that's what games use almost exclusively. (The reason for this is that integer data types are commutative rings, which gives a lot of nice structure, while floating point data types are algebraically ugly: among other things, neither addition nor multiplication is associative.) But a more suitable hash algorithm could make it so that the proportional gains from a custom ASIC are much smaller than for bitcoins.
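The associativity point is easy to see for yourself in Python: with one big and one small term, floating point addition gives different answers depending on how you group it, while integers behave.

```python
# Integer addition is associative: grouping never changes the answer.
a, b, c = 10**20, -(10**20), 1
print((a + b) + c == a + (b + c))   # True

# Floating point addition is not: the small term gets rounded away in one grouping.
x, y, z = 1e20, -1e20, 1.0
print((x + y) + z)   # 1.0
print(x + (y + z))   # 0.0 -- the 1.0 vanishes when it's added to -1e20 first
```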
People have figured this out and proposed other cryptocurrencies backed by other hash functions. Litecoin is the most prominent, though there are many others. The danger of this is that if you use a weak algorithm, someone may be able to conjure up arbitrarily large amounts of the currency quite easily, and then the currency collapses. For the traditional hash functions that have been extensively studied by many people over many years, efficient computation was a desirable property, which makes them poorly suited to backing a cryptocurrency.
Because comparing laptop graphics is a good indicator of anything.
Also, I don't think inflation is such a huge deal.
If you find a way to make cryptocoins faster, that's what the sliding difficulty scale is for -- as coins get mined faster, the difficulty ramps up so that the next coins take longer to make. That is more or less hardware independent, because even if you are using really, really fast mining hardware, the scale will catch up to whatever the fastest/most efficient hardware on the market is relatively quickly.
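Roughly how that works in Bitcoin is sketched below (illustrative only: the real rules retarget every 2016 blocks, clamp each adjustment to a factor of four, and operate on integer targets rather than a floating point difficulty number):

```python
# Sketch of a periodic difficulty retarget: if the last batch of blocks arrived
# faster than intended, scale difficulty up in proportion; if slower, scale it down.
EXPECTED_SECONDS = 2016 * 600        # 2016 blocks at one per ten minutes

def retarget(old_difficulty, actual_seconds):
    ratio = EXPECTED_SECONDS / actual_seconds
    ratio = max(0.25, min(4.0, ratio))   # one adjustment can't swing more than 4x either way
    return old_difficulty * ratio

# Faster hardware shows up and blocks start arriving twice as fast as intended:
print(retarget(1000.0, EXPECTED_SECONDS / 2))   # 2000.0 -- difficulty doubles to compensate
```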
If inflation were the problem with bitcoin because of these fast and highly efficient FPGA and ASIC-based speciality miners, we would have seen the value of a bitcoin plummet as the market was flooded with them.
The value didn't plummet due to inflation - its swings have been driven by political announcements and speculation more so than by anything to do with supply and demand. And it's seen record highs in terms of exchange rates lately.
Now, if someone could come up with an exploit to make an arbitrary number of coins fast enough that it effectively bypasses the anti-inflation scale, then yes, that would crash the market quickly. But nothing has made that order-of-magnitude leap in calculation efficiency yet.
One of my favorite youtube channels posted this a couple of days ago.
http://www.youtube.com/watch?v=ulg_AHBOIQU&list=UUoxcjq-8xIDTYp3uz647V5A
That was how the NSA could read all our data.
It seems likely that they have a backdoor into SHA-2 as well, and as such could destroy ***coins should they choose to.
And if it is deemed a financial threat (although I agree with Quizz, I kinda doubt it will be), they will destroy it.
Originally posted by Jerek_
"I wonder if you honestly even believe what you type, or if you live in a made up world of facts."