The flip side of NVidia's comparison is they expect next gen consoles to be more powerful than what? Low-end, mid-range, any PC with less than an RTX card?
It would have been way more potent though if they had felt "comfortable" comparing next-gen consoles to something with a "mid-range" NVidia card.
Not supposed to look beyond the headline obviously.
NVIDIA is obviously butt-hurt that all Next Gen Consoles are going to be using AMD chipsets once again!
It's a damn shame! If NVIDIA wasn't so overpriced and greedy, maybe their chips would be in the Next Gen Consoles!
Nvidia's problem is that AMD can bundle the CPU and GPU on the same die, with some secret sauce for the console vendor. There is no way a pure CPU or pure GPU vendor can compete.
I buy every console that comes out and I PC game. Every person I know who owns a console also has a PC that they game on. Not sure why it is an either-or thing with some people. It is about the games and the games alone. Consoles and PCs have great games. They both have gaming strengths and weaknesses. So I do both.
I do agree, but I only buy a console when the price drops, and I only have one at a time. For now it's the PS4; I also have a PS3 and an old Xbox 360. The PS4 is actually gathering dust, as even my adult kids don't use it.
It was MS (if I remember correctly) that started this by saying that the new Xbox would be faster than current PCs. Of course NVIDIA is going to fire back, considering they have nothing to lose by trashing the next wave of consoles.
I don't play consoles because I can't stand the controllers. The mouse and keyboard are much easier than those console controllers. The day I can easily attach a mouse and keyboard to a console is the day I might look into playing one... if it's better than my PC of course.
I have a PS4, Xbox 360, and PC gaming computer. Most of the neighborhood kids hang here, as well as my nephews and nieces. They all play on the PS4 or Xbox 360, especially the boys playing sports and war-type games. They also play games on their phones, but usually when they can't play on the consoles. I volunteer a few times a week at a school, and the problem now is that most families can't afford a computer that plays today's games. In speaking with parents, if they have a PC it's generally an older one to run things like Office and other academic and everyday tasks. They also feel that spending a couple hundred bucks on a system that can also act as a Blu-ray or DVD player is money better spent than getting a new PC/laptop for the same price, when a PC at that price will not play many of the games the kids want to play.
So Nvidia sees the next-gen consoles as a threat. Maybe it's the ray tracing that AMD is developing for the consoles that could end up in PCs. APUs could run PCs in the future? Nvidia has no APUs or CPUs. Something has rattled their cage.
I think it's just a ploy to keep stock prices stable. Show the shareholders that even though they didn't win a major console contract in the next-gen race (Nintendo obviously doesn't count there, oddly enough), nVidia is still the horse to bet on.
The whole "Raytracing in Consoles" thing has hit some headlines lately, nVidia wants to just make sure that doesn't spook their investors.
Nvidia does have APUs. They're called "Tegra". By far the most prominent use of Tegra is in the Nintendo Switch, though some cars also use them. Nvidia's APUs use off the shelf ARM cores for their CPU, so while they don't have their own CPU architecture, Nvidia can license ARM cores just like many other vendors do.
Every Max-Q (badly nerfed) laptop I've used has been a disappointment.
So I doubt this claim.
Is that all zero of them, or all one of them? The Max-Q branding hasn't been around for very long.
You are mistaken. Dell has sold Max-Q laptops for a while.
Edit: for example, there were 1060 Max-Q laptops several months ago.
People rarely have to buy more than one laptop in the time Max Q has been available, yet you claim experience from many of them. Quizzical's question was valid.
And how many gaming laptops have you needed to buy in those several months?
Okay, I'll be a little nicer than that. Nvidia announced the Max-Q branding at Computex 2017, which is about 2 1/2 years ago. If you've needed to replace several gaming laptops for personal use in that time, you're doing something wrong.
Actually, now that I look at it, Max-Q is a terrible idea. I had thought that it was just a lower power version of desktop cards, like the laptop vendors have been doing for more than a decade. Apparently it's not. It's an effort at making extra thin gaming laptops. If a gaming laptop is fundamentally an effort at putting too much heat into too little space, Max-Q's approach is to take that to the greatest extreme that they can.
Go here, for example, and look at the weight options:
https://www.newegg.com/p/pl?N=100167732 4814 601327079
I'll copy the relevant data here as of this posting. The weight ranges, together with the number of laptops in that range:
Under 6 pounds: 302 laptops
6-9.9 pounds: 10 laptops
10+ pounds: 73 laptops
That's nearly a bimodal distribution, which makes no engineering sense at all. The under-6-pounds group is the GeForce RTX 2080 Max-Q laptops, with a whole lot of heat in not very much space. That will be handled to some degree by throttling performance way back when necessary (i.e., when gaming). But I'd advise against taking the word "laptop" literally under gaming loads. You might not want to touch the built-in keyboard while gaming, either. Even if it seems okay when brand new, it won't be after it gets clogged with dust.
The 10+ pounds category is the real, high-end gaming laptops. Asus and MSI did what they had to in order to keep high-end hardware properly cooled. That can get you performance comparable to a gaming desktop, though it won't necessarily weigh much less than a small form factor gaming desktop.
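Just to put numbers on how lopsided that split is, here's a quick back-of-the-envelope calculation in Python using the Newegg counts quoted above (the counts are from that listing as of this posting; the percentages are just arithmetic):

# Share of gaming laptops in each Newegg weight bucket (counts quoted above).
buckets = {
    "Under 6 pounds": 302,   # the thin Max-Q designs
    "6-9.9 pounds": 10,      # the nearly empty middle
    "10+ pounds": 73,        # the big, properly cooled gaming laptops
}

total = sum(buckets.values())  # 385 laptops in the listing
for name, count in buckets.items():
    print(f"{name}: {count} laptops ({100 * count / total:.1f}%)")

# Prints roughly 78% under 6 pounds, under 3% in the middle, and about 19%
# at 10+ pounds -- which is why it reads as a nearly bimodal distribution.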
Read the reviews on the Max-Q laptops; most of them complain about heat. One said:
-needed an undervolt if you can do it, as even with full fan speed it would hit 90 celsius and throttle. (which is expected for its impressive size)
It makes absolutely no sense to drop that much money on something like this for gaming and have it just throttle due to heat.
I grew up with consoles, first owning a Telstar in the late 70s. Today I am purely a PC gamer. Somewhere around the 360 and Xbox One, the limitations of consoles became apparent to me. Once I learned PC (WASD was a bitch), I never touched another console.
Honestly not really surprising. The big edge consoles have is that they are closer to the metal (lower-level APIs) from the get-go and can make better use of the hardware inside the console than your average PC can. Consoles are almost always worse hardware-wise than even a laptop, but can always make better use of it.
He's jelly that Sony and MS always choose AMD. If NVIDIA wasn't so greedy they would have a chance at working with consoles. Sure, they worked with Nintendo on the Switch... with a tablet chip. If they wanted to have all consoles using tablet chips, then I'm glad they went with AMD again.
It's about keeping the consoles as low cost as possible while squeezing as much power as they can from that small budget. Jen-Hsun doesn't see that; he is too greedy.
I think AMD is getting the consoles because they can offer a package with good CPU and good integrated GPU. It's not about NVidia being greedy, it's about AMD having the best product.
Well, I mean, Nvidia literally can't do an integrated CPU/GPU unless it's ARM-based. Intel and AMD refuse to license x86/x86_64 out to anyone, and about the only way Nvidia could get the ability to make them is if they bought VIA, somehow kept VIA's x86 license, and then seriously ramped up CPU design based off that... If console makers are looking for a more APU-like design, it's literally impossible for anyone other than AMD to offer it to them. Intel has similar products, but they are much lower on the graphics end.
As far as Max-Q goes, it's silly. Get a gaming laptop that takes being a gaming laptop seriously if you are going to want it for gaming. The idea of thinner, lighter, more sleek shit is just asinine. Thin laptops don't have good cooling, period, and would suffer with anything other than an iGPU from Intel, in all honesty.
ARM Cortex A77 cores would be plenty fast enough for console use. Switching from x86 to ARM (or vice versa) could be troublesome for backward compatibility. But ARM cores not being fast enough in themselves isn't a problem anymore.
Let's not forget that the PS4, PS4 Pro, Xbox One, and Xbox One X all used AMD Jaguar cores that offered low clock speeds (around 2 GHz) and low IPC. They went that route for the sake of small die size and good energy efficiency. Which makes them kind of like what you'd expect from ARM cores.
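To make the reasoning explicit: per-core performance scales roughly with clock speed times IPC. The figures below are purely illustrative assumptions (not benchmarks), just to show that an A77-class core doesn't need heroics to clear the bar the Jaguar cores set:

# Rough per-core comparison: performance ~ clock (GHz) * relative IPC.
# Both IPC values are illustrative assumptions normalized to Jaguar = 1.0;
# they are not measured numbers, just placeholders for "low IPC" vs "higher IPC".
cores = {
    "AMD Jaguar (PS4 / Xbox One era)": {"clock_ghz": 2.0, "rel_ipc": 1.0},
    "ARM Cortex-A77 (hypothetical console config)": {"clock_ghz": 2.6, "rel_ipc": 2.0},
}

base = cores["AMD Jaguar (PS4 / Xbox One era)"]
base_perf = base["clock_ghz"] * base["rel_ipc"]

for name, c in cores.items():
    perf = c["clock_ghz"] * c["rel_ipc"]
    print(f"{name}: ~{perf / base_perf:.1f}x Jaguar per-core throughput")

# With these placeholder numbers the A77-class core comes out around 2-3x per
# core, which is the point: raw CPU speed is no longer the obstacle for ARM.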
Microsoft wants to use x86 for easy porting and compatibility between Xbox and Windows. Sony could choose to be the odd one out and go with ARM, but after how their foray into the Cell processor turned out, I don't think they want to choose a different architecture unless it's significantly better.
If AMD had gotten greedy or done something stupid, both Microsoft and Sony could have gone with either an NVidia/Intel or an NVidia/ARM solution, but AMD's product was the best, and AMD would have needed to do something stupid to not get this console generation.
Who says I bought them for myself? I think you should get your foot out of your mouth instead of trying to deflect away from the fact you were seriously very wrong about Max Q.
I've had mine for about 3 months now and I absolutely love it. I play RDR2 at 1080p with high settings at 60 frames. No problems with overheating or throttling. Cleaning any dust out is regular maintenance for any system. I've got desktops for work and home; I have this thing for portability when I travel. I'm typing this using it in my tiny closet of a hotel room in Shinjuku, Japan. Coincidentally, Square Enix is right across the street, which I thought was pretty cool.
I feel like nVidia is 90% marketing and 10% actual product at this point.