Why the hell is this thread still going? It seems to have run its course.
Of course it ran its course quite some time ago; then it looped back, started again, and has now finished the course for the second time. We await the third lap.
The internet will eventually take over the processing power of the PC, and this will mean things become more and more integrated. You will probably just have a smart TV with internet access, but that internet will be so fast and powerful it will take over from your graphics card. I doubt we will see this in 5 years, though; maybe within 10-20 years.
Once that happens, gaming hardware at home will become redundant. There won't be any PCs or consoles, but it won't matter because you will be able to do everything you want, gaming-wise, without them.
The internet will eventually take over the processing power of the PC, and this will mean things become more and more integrated. You will probably just have a smart TV with internet access, but that internet will be so fast and powerful it will take over from your graphics card. I doubt we will see this in 5 years, though; maybe within 10-20 years.
Once that happens, gaming hardware at home will become redundant. There won't be any PCs or consoles, but it won't matter because you will be able to do everything you want, gaming-wise, without them.
.... and here we go again.
The internet has no processing power. What you are actually talking about is moving the processing power from the user's end out to a remote mainframe. This is exactly the reverse of the process we have seen over time. Oh, there have been waves of revival for the central data room; thin clients were one, 'cloud' computing is another. But why, when local processing is becoming cheaper and more ubiquitous, would the world move back to the data center concept?
PC hardware is too good compared to anything else for it to go away. If anything, consoles should be rolled into PCs. I don't think PCs will ever go away. Tablets are great for on the go, but who wants to game on a tablet when you're at home? I sure don't.
Many people's first games were played on a console; no, not a PlayStation or an Xbox, but something like a Commodore, an Atari or a ZX Spectrum. Some of these later added screens, but the first ones didn't; you plugged them into your TV.
Key point: monitor
Machines with screens and operating systems like CP/M eventually coalesced around the IBM design using x86 and MS-DOS. That standard has now fractured. One of the big advantages that consoles have is a standard and an operating system.
Key point: game developers need a standard, in particular a standard operating system
Outside of the monitor (and optionally the keyboard), the hardware is "irrelevant", as long as it's powerful enough. Talk about tablets not having enough storage: a 200GB microSD card has just been announced with very fast transfer times (cameras are a key market). Graphics? All of Nvidia's current cards are now Maxwell - lower power - and they have already announced that their next card will be 1.6 times faster than a 960. It won't be long before manufacturers could build a watch that would run today's games.
Far-fetched? At CES 2015 Intel announced the Compute Stick. It looks like a USB stick: quad-core Atom processor (a Core M version - the processor that will be in the new MacBook, with built-in graphics - is apparently in the pipeline), 32GB of storage with a microSD slot so you could add another 200GB, 2GB of RAM, Wi-Fi, Bluetooth, USB and HDMI, in Windows ($149) or Linux ($89) versions. Launch: this month.
Key point: hardware continues to get smaller, faster, more powerful, more energy efficient and cheaper
So a "PC" going forward?
a monitor, possibly a TV
an operating system - the jury is out on which one
optional: a Bluetooth keyboard, controller, etc.
and something to drive it - a Compute Stick, a tower, a phone, an MSI Gaming Block, a TV. As long as it is fast enough to run what we want it to do... and tomorrow's hardware will be more powerful than today's.
The extras - things like movement sensors and virtual reality - will play a part, but what we are seeing on the hardware side is something called "convergence". Essentially, the only things that will differentiate watches, phones, tablets, laptops and TVs will be how big the monitor is and what sort of controller we use. The operating system - that is a battle yet to be decided. And for gamers the operating system matters: developers need platforms that will run their games. Hence they like consoles.
The guy on the radio show is a flipping loon. Can you imagine doing CAD work, video editing, or coding all on a tablet interface? I would quite literally want to shoot someone.... Not to mention all of the PC gaming industry; from a PC professional's perspective, him stating that PCs will be gone within 5 years is just asinine.
The internet will eventually take over the processing power of the PC, and this will mean things become more and more integrated. You will probably just have a smart TV with internet access, but that internet will be so fast and powerful it will take over from your graphics card. I doubt we will see this in 5 years, though; maybe within 10-20 years.
Once that happens, gaming hardware at home will become redundant. There won't be any PCs or consoles, but it won't matter because you will be able to do everything you want, gaming-wise, without them.
.... and here we go again.
The internet has no processing power. What you are actually talking about is moving the processing power from the user's end out to a remote mainframe. This is exactly the reverse of the process we have seen over time. Oh, there have been waves of revival for the central data room; thin clients were one, 'cloud' computing is another. But why, when local processing is becoming cheaper and more ubiquitous, would the world move back to the data center concept?
Quite simple really. Big business is driving the new direction, because it saves money. IT is a commodity, a service that is bought from whichever vendor delivers the best value.
Whole IT departments are being scrapped and replaced with service contracts. The Fortune 500 company that I used to work for did exactly that. Huge savings. Employee job descriptions are being replaced by vendor SLAs (Service Level Agreements).
And the "data center concept" is outmoded. The business doesn't care about the technicalities of where (or how) the data is processed or kept. That's something the vendor has to take care of; it's all detailed in the SLAs. All that's important is that everything works at an acceptable level 24/7. There's an army of lawyers ready to make sure that happens, or else...
Originally posted by ragz45: The guy on the radio show is a flipping loon. Can you imagine doing CAD work, video editing, or coding all on a tablet interface? I would quite literally want to shoot someone.... Not to mention all of the PC gaming industry; from a PC professional's perspective, him stating that PCs will be gone within 5 years is just asinine.
For CAD work and video editing, the monitor matters.
Coding, though - processing power stuff - nah. Intel has announced 18-core processors, AMD a new set of more powerful processors this month, ARM a new set.... As I said, it's only a matter of time before they could build a watch powerful enough.
But the monitor matters.
First off, I am old enough to disclose that my first PC was the Tandy 1000 I bought at Radio Shack back in the late '80s. I also studied the architecture of the 8088/8086 chip. Since then I have owned and built more PCs than I can remember. On the weekends I listen to a guy on the radio who does a tech show. He is very knowledgeable and I respect his opinion the majority of the time.
However, this past weekend he made a bold statement I completely disagree with. He said the PC (as we know it) will go the way of the dodo and not be around in 5 years. He theorizes that everything will be going to "PC tablets". Sure, the PC market has taken a sizable hit since the iPad and other tablets became very popular. However, with the multi-billion-dollar video game industry, as well as the major players in GPUs/video cards, there is no way (in my opinion) the PC will die. Thoughts?
-Rig
The only thing about the tablets-taking-over theory is that I've read tablets are actually not selling well at the moment; smartphones are their biggest competitor. So I highly doubt the PC will die off, let alone to tablets.
The internet will eventually take over the processing power of the PC, and this will mean things become more and more integrated. You will probably just have a smart TV with internet access, but that internet will be so fast and powerful it will take over from your graphics card. I doubt we will see this in 5 years, though; maybe within 10-20 years.
Once that happens, gaming hardware at home will become redundant. There won't be any PCs or consoles, but it won't matter because you will be able to do everything you want, gaming-wise, without them.
.... and here we go again.
The internet has no processing power. What you are actually talking about is moving the processing power from the user's end out to a remote mainframe. This is exactly the reverse of the process we have seen over time. Oh, there have been waves of revival for the central data room; thin clients were one, 'cloud' computing is another. But why, when local processing is becoming cheaper and more ubiquitous, would the world move back to the data center concept?
And how would you access that remote mainframe? It wouldn't be through a telegraph machine, that's for sure. Could it be, god forbid, through the internet? OK, it may not be the internet itself that handles the actual processing, but it would be how you access it. You are just being a bit pedantic, but I am sure that's not the first time someone has said that to you.
Just remember this post of yours when it happens in a decade or two; maybe you will look back at your younger self and laugh at just how lacking in foresight you were. But lacking foresight really is part of the pedantic personality.
The internet will eventually take over the processing power of the PC, and this will mean things become more and more integrated. You will probably just have a smart TV with internet access, but that internet will be so fast and powerful it will take over from your graphics card. I doubt we will see this in 5 years, though; maybe within 10-20 years.
Once that happens, gaming hardware at home will become redundant. There won't be any PCs or consoles, but it won't matter because you will be able to do everything you want, gaming-wise, without them.
.... and here we go again.
The internet has no processing power. What you are actually talking about is moving the processing power from the user's end out to a remote mainframe. This is exactly the reverse of the process we have seen over time. Oh, there have been waves of revival for the central data room; thin clients were one, 'cloud' computing is another. But why, when local processing is becoming cheaper and more ubiquitous, would the world move back to the data center concept?
Quite simple really. Big business is driving the new direction, because it saves money. IT is a commodity, a service that is bought from whichever vendor delivers the best value.
Whole IT departments are being scrapped and replaced with service contracts. The Fortune 500 company that I used to work for did exactly that. Huge savings. Employee job descriptions are being replaced by vendor SLAs (Service Level Agreements).
And the "data center concept" is outmoded. The business doesn't care about the technicalities of where (or how) the data is processed or kept. That's something the vendor has to take care of; it's all detailed in the SLAs. All that's important is that everything works at an acceptable level 24/7. There's an army of lawyers ready to make sure that happens, or else...
.... and neither, it seems, do you. If the data is being processed centrally, and it is, then it is being processed in a data center. That the data center is owned and operated by a third party and not by the contracting company does not change the fact that it is a data center, and probably a large one processing for a number of clients. Oh, and BTW, this approach has been a fashion before and will no doubt be one again. It is sort of a "tide goes in, tide goes out" thing.
But that is an argument about where the data center is. My point was that for things that are not currently being done in a data center, why would you move them back into one? The trend is toward more processing power at the edge, not moving processing power back to the center.
The internet will eventually take over the processing power of the PC, and this will mean things become more and more integrated. You will probably just have a smart TV with internet access, but that internet will be so fast and powerful it will take over from your graphics card. I doubt we will see this in 5 years, though; maybe within 10-20 years.
Once that happens, gaming hardware at home will become redundant. There won't be any PCs or consoles, but it won't matter because you will be able to do everything you want, gaming-wise, without them.
.... and here we go again.
The internet has no processing power. What you are actually talking about is moving the processing power from the user's end out to a remote mainframe. This is exactly the reverse of the process we have seen over time. Oh, there have been waves of revival for the central data room; thin clients were one, 'cloud' computing is another. But why, when local processing is becoming cheaper and more ubiquitous, would the world move back to the data center concept?
And how would you access that remote mainframe? It wouldn't be through a telegraph machine, that's for sure. Could it be, god forbid, through the internet? OK, it may not be the internet itself that handles the actual processing, but it would be how you access it. You are just being a bit pedantic, but I am sure that's not the first time someone has said that to you.
Just remember this post of yours when it happens in a decade or two; maybe you will look back at your younger self and laugh at just how lacking in foresight you were. But lacking foresight really is part of the pedantic personality.
If I am around and still commenting on this stuff in twenty years, that will be an achievement in itself. When I look back at predictions I made about technology 20, 30, even 45 years ago, what I see is far more success than failure. Some have called me pedantic before, generally when they have been sloppy about their phrasing and are embarrassed that someone called them on it.
The thing about your phrasing is that calling it the 'internet' implies some new idea, when in fact it is a very old one: central data processing with dumb terminals. Now, if spare processing capacity on my 'smart phone' and 'smart watch' and even my 'smart refrigerator' could supplement what I need for my PC or tablet, that might be an interesting and new idea.
I'll stick with my original argument. Smartphones and consoles will always serve their limited purposes, and add things as appropriate, because technology and engineering are purpose driven, not dominance driven.
As much as I hate analogies, screw it, I'm gonna make one here. From a gaming standpoint:
A PC is a house. It stores all your stuff and keeps it somewhat secure. Sure, you could rent a storage space (a cloud), but good luck with that.
A console is a mobile home. You can't store a whole lot, but you can live in it.
A smartphone is a car. It's the best way to "take your house with you", but it's still pretty limited in what you can do with it. And it will never be as effective as your house.
I'll stick with my original argument. Smartphones and consoles will always serve their limited purposes, and add things as appropriate, because technology and engineering are purpose driven, not dominance driven.
As much as I hate analogies, screw it, I'm gonna make one here. From a gaming standpoint:
A PC is a house. It stores all your stuff and keeps it somewhat secure. Sure, you could rent a storage space (a cloud), but good luck with that.
A console is a mobile home. You can't store a whole lot, but you can live in it.
A smartphone is a car. It's the best way to "take your house with you", but it's still pretty limited in what you can do with it. And it will never be as effective as your house.
Eh, I'm not as anti-analogy as most on these forums are; analogies are just meant to demonstrate similarities, and they can all be knocked down with ease.
But I have to hand it to you, that was a sweet analogy. Well done, Robsolf.
"Mr. Rothstein, your people never will understand... the way it works out here. You're all just our guests. But you act like you're at home. Let me tell you something, partner. You ain't home. But that's where we're gonna send you if it harelips the governor." - Pat Webb
I remember buying my first PC. I wanted to make sure that it had enough HD space for Ultima 7. I think I ended up spending 3K on a 486DX with a 100MB HDD.
After that I remember upgrading something almost yearly, and then every two years, usually when a game came out that I absolutely could not run, e.g. Wing Commander IV.
Today I upgrade because I want ultra settings or a faster frame rate.
I have not bought a new PC in over 7 years. Do they count motherboards as a PC purchase? I am sure I have purchased a complete PC in parts during that time, but a whole system... nope.
To compare, I am on my second tablet in 3 years.
A lot of people I know who thought tablets were the second coming now barely use them; they have become e-readers and web browsers for the washroom and bed.
What if gaming becomes streamed, so there's no need for high-end hardware, just an interface? Also, I really like the look of this.
https://www.youtube.com/watch?v=aThCr0PsyuA&t=94
https://www.youtube.com/watch?v=b6sL_5Wgvrg
This is no longer science fiction.
The thing is, it is architecturally better to have the processing power as close to the user as possible. Thin clients are an old idea and do not lead to the savings promised.
With regards to the videos: lots of Microsoft hype. Let's wait until one is actually for sale, and I bet that in addition to the headgear you are going to need a pretty powerful PC locally to actually run the thing.
Oh, and did you notice how many audience members were using Apple notebooks in that Windows presentation?
The thing is, it is architecturally better to have the processing power as close to the user as possible. Thin clients are an old idea and do not lead to the savings promised.
With regards to the videos: lots of Microsoft hype. Let's wait until one is actually for sale, and I bet that in addition to the headgear you are going to need a pretty powerful PC locally to actually run the thing.
Oh, and did you notice how many audience members were using Apple notebooks in that Windows presentation?
Maybe Microsoft did buy Apple through the back door? :P
I cannot see the home PC being dead, but I do see it evolving away from being a desktop/laptop; I think the desktop/laptop will be more for industry than for the home. We are getting closer to everything going through one device.
First off, I am old enough to disclose that my first PC was the Tandy 1000 I bought at Radio Shack back in the late '80s. I also studied the architecture of the 8088/8086 chip. Since then I have owned and built more PCs than I can remember. On the weekends I listen to a guy on the radio who does a tech show. He is very knowledgeable and I respect his opinion the majority of the time.
However, this past weekend he made a bold statement I completely disagree with. He said the PC (as we know it) will go the way of the dodo and not be around in 5 years. He theorizes that everything will be going to "PC tablets". Sure, the PC market has taken a sizable hit since the iPad and other tablets became very popular. However, with the multi-billion-dollar video game industry, as well as the major players in GPUs/video cards, there is no way (in my opinion) the PC will die. Thoughts?
-Rig
I work with AutoCAD, Tekla, Inventor, StruCad and SolidWorks.
Besides that, I like to play 3D shooters and other 3D games.
There is no way at this moment that you can use those programs efficiently on a tablet.
People who use tablets full-time are only playing Flash games, watching YouTube, going on Facebook and doing other small stuff like that.
Working professionally with tablets and 3D applications is a no-go.
"Going into arguments with idiots is a lost cause; it requires you to stoop down to their level, and you can't win."
The internet will eventually take over the processing power of the PC, and this will mean things become more and more integrated. You will probably just have a smart TV with internet access, but that internet will be so fast and powerful it will take over from your graphics card. I doubt we will see this in 5 years, though; maybe within 10-20 years.
Once that happens, gaming hardware at home will become redundant. There won't be any PCs or consoles, but it won't matter because you will be able to do everything you want, gaming-wise, without them.
.... and here we go again.
The internet has no processing power. What you are actually talking about is moving the processing power from the user's end out to a remote mainframe. This is exactly the reverse of the process we have seen over time. Oh, there have been waves of revival for the central data room; thin clients were one, 'cloud' computing is another. But why, when local processing is becoming cheaper and more ubiquitous, would the world move back to the data center concept?
Quite simple really. Big business is driving the new direction, because it saves money. IT is a commodity, a service that is bought from whichever vendor delivers the best value.
Whole IT departments are being scrapped and replaced with service contracts. The Fortune 500 company that I used to work for did exactly that. Huge savings. Employee job descriptions are being replaced by vendor SLAs (Service Level Agreements).
And the "data center concept" is outmoded. The business doesn't care about the technicalities of where (or how) the data is processed or kept. That's something the vendor has to take care of; it's all detailed in the SLAs. All that's important is that everything works at an acceptable level 24/7. There's an army of lawyers ready to make sure that happens, or else...
.... and neither, it seems, do you. If the data is being processed centrally, and it is, then it is being processed in a data center. That the data center is owned and operated by a third party and not by the contracting company does not change the fact that it is a data center, and probably a large one processing for a number of clients. Oh, and BTW, this approach has been a fashion before and will no doubt be one again. It is sort of a "tide goes in, tide goes out" thing.
But that is an argument about where the data center is. My point was that for things that are not currently being done in a data center, why would you move them back into one? The trend is toward more processing power at the edge, not moving processing power back to the center.
Yes, it certainly does appear to be cyclic, depending on advances in technology.
Four decades ago, companies running mainframes and minis used to send their data to "processing bureaus" on magnetic tapes for things like payroll and billing runs.
As the tech advanced, the in-house machines became powerful enough to perform those activities themselves. The processing bureaus vanished.
Nowadays companies are once again using "specialist partners" for all kinds of functions like debtors management and payroll processing, because networking tech has advanced to the point where the data exchange can take place in near real time. It also means that the business no longer needs to employ the (expensive) specialist knowledge that is needed to administer these increasingly complex functions.
"Big Data" is the current driver of change. Analysing vast volumes of data to find trends is the "hot topic". Cloud computing makes that a lot easier to achieve in a cost-efficient way, because of the ease of scaling.
With "Big Data", the balance of power is once again shifting away from the "edge" to the "center". Individual PCs are simply not able to deal with these data volumes when it comes to processing and storage/retrieval. Devices on the edge (like PCs) are requesting, displaying and manipulating results generated at the center (i.e. in the cloud).
The vast majority of home users are using their PCs as entertainment or internet browsers. As those functions can increasingly be handled by smaller, more mobile devices, the need for PCs will be reduced substantially.
Many people's first games were played on a console; no, not a PlayStation or an Xbox, but something like a Commodore, an Atari or a ZX Spectrum. Some of these later added screens, but the first ones didn't; you plugged them into your TV.
I'm wondering why you would label a ZX Spectrum a console? To me it's a personal home computer, or PC. I had one as a kid (a +3) and I could do more than just game on it. I could program, word process, print. It was widely regarded as a PC at the time.
Unless I'm missing something?
First off, I am old enough to disclose that my first PC was the Tandy 1000 I bought at Radio Shack back in the late '80s. I also studied the architecture of the 8088/8086 chip. Since then I have owned and built more PCs than I can remember. On the weekends I listen to a guy on the radio who does a tech show. He is very knowledgeable and I respect his opinion the majority of the time.
However, this past weekend he made a bold statement I completely disagree with. He said the PC (as we know it) will go the way of the dodo and not be around in 5 years. He theorizes that everything will be going to "PC tablets". Sure, the PC market has taken a sizable hit since the iPad and other tablets became very popular. However, with the multi-billion-dollar video game industry, as well as the major players in GPUs/video cards, there is no way (in my opinion) the PC will die. Thoughts?
-Rig
Ey bro, would love to see you render those SpecFX that 99% of Hollywood movies use on an iPad.
What's a PC?