Yes, the GTX 1050 Ti mobile is coming to laptops, and it's got better benchmarks than the PC version. Now that's freaking nasty for a laptop. A card that uses 75 W max power consumption and runs 10% faster than the GTX 970M. And at a price you just can't beat. With the RX 460 using 104 W of power, it's just not gonna compete with these numbers.
Are you onto something or just on something?
Comments
This is based on overclocking, and a lot more OC than the other machines. So what happens under serious duress when OC'd that much? I would say it will be very unstable and/or need some serious cooling.
Prices: I think it's too early to say, but my guess is that since it's new and it's a laptop, it will be overpriced by a lot.
If I had to guess the shelf price here in Canada, I'd say likely around $2,800-3,200 for something just above average that is simply OC'd a ton.
As well, the 4 GB memory sounds like it won't be as good as advertised. I absolutely never care about benchmarks unless I see moving in-game footage.
Never forget Three Mile Island, and never trust a government official or company spokesman.
At least here in the US, the largest battery you usually see installed in a laptop is 100 Wh (the current FAA limit for lithium-ion brought as carry-on; you can have bigger batteries, but then they have to be checked).
So I don't do math so well, I'll let other smarter people figure out why that's an interesting fact.
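The arithmetic being hinted at is simple enough to sketch: even the largest carry-on-legal battery doesn't last long against a 75 W GPU. The non-GPU draw below is an assumed figure for illustration, not a measured one.

```python
# Rough runtime math behind the 100 Wh carry-on limit.
# The 35 W figure for the rest of the system is an assumption, not a measurement.

battery_wh = 100.0        # biggest battery the FAA allows as carry-on
gpu_w = 75.0              # GTX 1050 Ti mobile's quoted max draw
rest_of_system_w = 35.0   # assumed CPU/screen/etc. under gaming load

hours = battery_wh / (gpu_w + rest_of_system_w)
print(f"~{hours:.2f} h of gaming on battery")  # ~0.91 h
```

So even under generous assumptions, that's under an hour of gaming on battery.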
Also, not saying that the difference between a 75 W and a 104 W GPU is meaningless... except that I am, because... seriously, for my own use, even 75 W is just way too much to consider putting into a laptop, considering you still have ~everything else~ that needs power on top of that.
Mostly, I see the price bundled in with the rest of the laptop, so I don't really know what the GPU costs by itself.
-Unconstitutional laws aren't laws.-
We were really hoping that AMD was supplying us with solid laptop GPUs, but they just didn't get it this time.
Now, they could and probably either have or will make a lower clocked, lower wattage version of the same chip that limits itself to a more laptop-friendly 35 W or so. But the 75 W laptop card is supposed to be a lower clocked version of the next chip up. A GeForce GTX 1060 with its clock speed and voltage suitably reduced could probably offer a lot more performance in that same 75 W envelope. That's basically what both AMD and Nvidia have done with their laptop GPUs for many generations now.
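The reason a downclocked bigger chip tends to win at a fixed wattage comes from how dynamic power scales, roughly with frequency times voltage squared in the simplified CMOS model. The ratios below are illustrative, not measured figures for any specific GPU.

```python
# Simplified CMOS dynamic-power model: P is roughly proportional to f * V^2.
# Ratios are illustrative only.

def rel_power(f_ratio, v_ratio):
    """Relative dynamic power after scaling frequency and voltage."""
    return f_ratio * v_ratio ** 2

# Dropping clocks 20% and voltage 10% cuts dynamic power by ~35%:
print(rel_power(0.8, 0.9))  # 0.648
```

That's why a GTX 1060 at reduced clocks and voltage could plausibly deliver more performance per watt than a 1050 Ti pushed to its limits.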
This new batch of desktop-class GPUs goes by their CUDA-core-matching desktop names, but the significantly increased clock rates of the GTX 1050 Ti and the reduced ROP count of the 1050 almost make these completely different GPUs (in terms of potential performance) compared to their desktop counterparts. However, until we review a laptop with these new graphics and pit it against a desktop with a similar configuration, we'll have to reserve judgment.
AnandTech (4th January) seemed to be at a bit of a loss to explain the apparent differences as well.
http://www.tomshardware.co.uk/nvidia-gtx-1050-1050ti-mobile,news-54495.html
http://www.anandtech.com/show/10980/nvidia-launches-geforce-gtx-1050-ti-gtx-1050-for-laptops
Hence the links and current user scores etc. may not be applicable. The proof of the pudding will be in the eating - or not.
It ain't the power consumption that's the real problem though, but the heat. It's incredibly hard to get good cooling while still keeping the laptop small and portable. Until the technology gets better, we'll be stuck with under-performing laptops, and most gamers will use a desktop no matter how practical it would be to grab your entire computer from your home docking station (usually that means the keyboard, mouse, and real large screen you jack into at home nowadays) and bring it with you wherever you go.
It's conservation of energy.
Your point was probably that cooling is more of a problem than battery drain, and I'm not arguing against that.
This isn't the actual federal regulation, but it's an FAA handout derived from it, so ... close enough for all but the lawyers I suppose.
Most laptops don't have a battery anywhere near 100 Wh. The new MBP (love it or loathe it), for instance, only has 54 Wh. The longest-running laptop that I could find just doing a quick and totally non-scientific Google search was a Lenovo ThinkPad X260, and it only had an (optional upgrade, not standard) 72 Wh battery. A quick look through the Sager gaming lineup, and the biggest I could find there was a 76 Wh battery (but I didn't look very deeply, I admit).
Is Pascal faster than GCN? Probably in most scenarios, but in a laptop, I'd say that's way down on the list of priorities.
And Polaris 11, we've only just seen it come out in the most recent Apple lineup - I don't think it's spread beyond that yet, but I could be mistaken.
They are more concerned with performance and heat, realizing that they won't get anything as good as an RX 480 but can come close within reason. For example, the GTX 1070 and 1080 for mobile duplicate the performance of the PC versions, but the heat, man, will probably kill someone.
You could say I made this account on MMORPG just to send you that lol.
Integrated graphics offers some major advantages over a discrete card. No need to pass data to the GPU over a PCI Express bus saves cost and power, and can improve performance in some cases. Having only one hot spot in a laptop to cool rather than two makes cooling much simpler. Not needing a giant card (well, giant by laptop standards, anyway) for the GPU makes spacing and cooling simpler. Not needing two separate GPUs in the system because you want to shut down the discrete GPU when not gaming to save power but need a GPU running to show anything on the screen saves a lot of complexity--and avoids a lot of video driver problems.
Right now, the problem with integrated graphics for gaming is that it can't give enough performance. AMD could build a bigger APU with a lot more compute units (and has for consoles!), but it wouldn't work very well in laptops because it would be starved for memory bandwidth. Two channels of DDR4 just isn't enough to feed a big GPU. Stick HBM on package and that problem goes away, while all of the advantages of integrated graphics stay.
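The bandwidth gap being described is easy to put numbers on. Peak bandwidth is channels times bus width (in bytes) times transfer rate; the figures below use DDR4-2400 and first-generation HBM as representative examples.

```python
# Back-of-the-envelope memory bandwidth comparison.
# DDR4-2400 and first-gen HBM figures are representative examples.

def bandwidth_gbs(channels, bus_width_bits, transfers_per_sec):
    """Peak bandwidth in GB/s: channels * bus width (bytes) * transfer rate."""
    return channels * (bus_width_bits / 8) * transfers_per_sec / 1e9

# Dual-channel DDR4-2400: 2 channels x 64-bit x 2400 MT/s
ddr4 = bandwidth_gbs(2, 64, 2400e6)
print(f"DDR4 dual-channel: {ddr4:.1f} GB/s")  # 38.4 GB/s

# One stack of first-gen HBM: 8 channels x 128-bit x 1000 MT/s
hbm = bandwidth_gbs(8, 128, 1000e6)
print(f"HBM (one stack):   {hbm:.1f} GB/s")  # 128.0 GB/s
```

Roughly a 3x-plus gap from a single HBM stack, which is why on-package memory changes the math for a big APU.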
So they can make the APUs better, but they are going to have to give them much better power usage in order to pull over 2 hours on a battery.
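The 2-hour target translates directly into a power budget: with the 76 Wh pack quoted earlier in the thread as the biggest gaming-laptop battery found, the whole system has to average no more than battery capacity divided by target runtime.

```python
# What "over 2 hours on a battery" implies for total system draw.
# 76 Wh is the Sager pack size quoted earlier in the thread.

battery_wh = 76.0
target_hours = 2.0

max_avg_w = battery_wh / target_hours
print(f"Whole system must average <= {max_avg_w:.0f} W")  # 38 W
```

A 38 W whole-system budget leaves very little room for a discrete-class GPU, which is the integrated-graphics argument in a nutshell.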