I think almost no one will play MMOs at 4K resolution.


Comments

  • Adamantine Member RarePosts: 5,093

    > 1. Bill Gates never said that

    Who cares. The reason this quote keeps circulating is that it sums up an assumption that comes up all the time. The Gates quote may not be genuine, but it illustrates the foolishness of such assumptions very well. So even if he never said it, it's still well invented.

    Also, Microsoft DID INDEED build an unnecessary 640 KB limit into a machine that could safely address 1024 KB.

     

    > 2. We have reached many limits in technology where there are such diminishing returns that it's pointless to increase the quality because our senses can't perceive it.

    "Full HD" has 2 Megapixels resolution (1920x1080). As do the older mars rovers spirit and opportunity (1600x1200).

    When digital photography was young, everybody believed 3 Megapixels would be sufficient for anything.

    Then it was 6 Megapixels.

    Later 12 Megapixels.

    Nowadays the limit is supposedly 20 Megapixels.

    I found an article by a photographer who concluded that for the largest prints at the best printing quality he would need 160 Megapixels. The highest single-shot resolution of commercial cameras right now is 80 Megapixels.

    Large-format 8x10" cameras can apparently produce the equivalent of hundreds of megapixels, depending on the film material used.

    Special monochrome films can reach gigapixels of resolution.

    So - sorry, but nope. The human eye has only about 6 Megapixels - but most of that resolution is in the center of vision, a roughly 3-degree area, and we move that center around wherever we want to look.

    The 1920x1080 of older monitors is nothing. The 3840x2160 resolution of newer monitors is still far from the end of the road.
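
    For anyone who wants to check those figures, here is a quick back-of-the-envelope Python sketch of the megapixel arithmetic for the display resolutions mentioned in this thread (pure pixel counting; it says nothing about what the eye can actually resolve):

    ```python
    # Megapixel counts and ratios for common display resolutions (simple arithmetic).
    resolutions = {
        "Full HD (1920x1080)": (1920, 1080),
        "4K UHD (3840x2160)":  (3840, 2160),
        "8K UHD (7680x4320)":  (7680, 4320),
    }

    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        px = w * h
        print(f"{name:22s} {px / 1e6:5.1f} MP  ({px / base:2.0f}x the pixels of 1080p)")

    # Prints roughly:
    # Full HD (1920x1080)     2.1 MP  ( 1x the pixels of 1080p)
    # 4K UHD (3840x2160)      8.3 MP  ( 4x the pixels of 1080p)
    # 8K UHD (7680x4320)     33.2 MP  (16x the pixels of 1080p)
    ```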

     

  • Stone_Fountain Member UncommonPosts: 233
    An MMO on high graphics settings is a lot harder on your frame rate than any single-player game, especially when a lot of toons are around you. Spells, emotes and movement will severely hamper your frame rate on high settings. You can have a great internet connection and a very powerful PC, but a lot of spontaneous activity on the screen will really test your frame rate. 4K monitors do not have very good response times, and I'm not going to pay mega bucks for a monitor when that money could be better spent inside my tower. Also, if you go the VR route you don't need a monitor anyway.

    First PC Game: Pool of Radiance July 10th, 1990. First MMO: Everquest April 23, 1999

  • dave6660 Member UncommonPosts: 2,699
    Originally posted by Robokapp
    Originally posted by rojoArcueid
    Originally posted by NomadMorlock

    My understanding is that anything over 30 FPS cannot be seen by the naked eye, therefore 4x the resolution allows the game to be more beautiful and detailed (significantly) as long as the FPS never drops below 30.

     

    Wouldn't you want to have 4x the resolution instead of 200 FPS that looks the same as 30FPS?

    You can't see the difference between 30 FPS and higher FPS?

    I personally don't care about 4K; as long as I get 60 FPS in my games I'm happy. At the moment I don't get that with many games on my old PC, but damn, the difference is so noticeable it's not even funny. I'm really sick of some devs jumping on the "cinematic" 30 FPS bandwagon. If higher resolution limits the frame rate to 30 due to a lack of stronger tech, then I'll take 1080p at 60 FPS over 4K any day.

    Can you tell 60 FPS from 120 FPS?

     

    I can't.

    For very fast-moving objects you might be able to tell the difference - the jump between successive positions starts to show. For the most part I cannot tell the difference either.
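
    A quick sketch of why very fast motion is where extra frames matter: how far an object travels between consecutive frames. The screen width and sweep time below are just assumed example numbers, not measurements.

    ```python
    # Per-frame displacement of an object crossing a 1080p screen in one second (assumed).
    SCREEN_WIDTH_PX = 1920
    SWEEP_TIME_S = 1.0

    speed_px_per_s = SCREEN_WIDTH_PX / SWEEP_TIME_S

    for fps in (30, 60, 120):
        frame_time_ms = 1000 / fps
        step_px = speed_px_per_s / fps  # gap between the object's positions in two consecutive frames
        print(f"{fps:3d} FPS: {frame_time_ms:4.1f} ms per frame, {step_px:4.0f} px jump per frame")

    # Prints roughly:
    #  30 FPS: 33.3 ms per frame,   64 px jump per frame
    #  60 FPS: 16.7 ms per frame,   32 px jump per frame
    # 120 FPS:  8.3 ms per frame,   16 px jump per frame
    ```

    The bigger the per-frame jump, the more visible the steppiness, which is why slow pans look fine at 30 FPS while fast camera swings do not.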

    “There are certain queer times and occasions in this strange mixed affair we call life when a man takes this whole universe for a vast practical joke, though the wit thereof he but dimly discerns, and more than suspects that the joke is at nobody's expense but his own.”
    -- Herman Melville

  • JJ82 Member UncommonPosts: 1,258

    The real reason most won't do it is that no MMOs are designed around it... you will gain nothing from playing current MMOs in 4K with the kind of graphics they have.

    Make another thread about this in 3-4 years.

    "People who tell you you’re awesome are useless. No, dangerous.

    They are worse than useless because you want to believe them. They will defend you against critiques that are valid. They will seduce you into believing you are done learning, or into thinking that your work is better than it actually is." ~Raph Koster
    http://www.raphkoster.com/2013/10/14/on-getting-criticism/

  • Axehilt Member RarePosts: 10,504
    Originally posted by Kiyoris

    Some MMOs use the argument that they support 4K resolutions (in the settings).

    I just don't think it makes any sense.

    I have been looking at commonly sold video cards and the hit they take when they go from 1080p to 4K.

    The amount of FPS you lose by going to 4k is massive.

    It's a 3x to 4x reduction in FPS.

    Considering each new GPU generation increases performance by barely 10%-15%, a 4x performance hit from using 4K is hard to justify.

    Up to 4 times lower FPS by going from 1080P to 4K.

    It makes no sense that:

    1. People would put up with a 4x drop in framerates

    2. People would justify the extra cost in power consumption / hardware price for such horrible gains when they could be playing at 4 times the FPS for cheaper

    Sure, we went from 640x480 to 800x600 to 1024x768 and then to 1080p, but those changes resulted in easily verifiable gains in graphics quality; you can easily SEE the difference between 800x600 and 1080p.

    However, this jump from 1080p to 4K is barely visible and causes a massive drop in performance, so it makes no sense to me.

    What exactly are you trying to say?  That this will never be adopted?  That's crazy.  We've watched the slow march up from early resolutions to the current norms, and that march will continue until the improvements stop being discernible to the human eye.

    As others have said, the cutting edge isn't for average people with average machines.  It's for the 1% with the fastest machines.
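
    For what it's worth, the "3x to 4x" figure in the quote above falls straight out of the pixel counts: 4K pushes exactly four times the pixels of 1080p, so a fully GPU/fill-rate-bound scene can lose up to ~4x the frame rate. A minimal sketch of that arithmetic (the 120 FPS starting point is just an assumed example; real games scale less than linearly because CPU and geometry costs don't grow with resolution):

    ```python
    # Pixel-count ratio between 1080p and 4K, and the worst-case FPS implication.
    px_1080p = 1920 * 1080
    px_4k = 3840 * 2160

    pixel_ratio = px_4k / px_1080p
    print(f"4K / 1080p pixel ratio: {pixel_ratio:.1f}x")  # 4.0x

    fps_1080p = 120  # assumed 1080p frame rate, purely for illustration
    print(f"Worst-case fill-bound 4K estimate: {fps_1080p / pixel_ratio:.0f} FPS")  # 30 FPS
    ```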

    "What is truly revealing is his implication that believing something to be true is the same as it being true. [continue]" -John Oliver

  • DAS1337 Member UncommonPosts: 2,610

    I don't know anything about all that technical BS. What I do know is that I could tell a massive difference between my 60Hz and 120Hz TVs. I could tell the difference when I bought my 144Hz monitor too, though not as big a difference, I believe due to the size. So I can tell a difference. A friend of mine has a 240Hz TV and it's like watching a real person standing in front of you. I've seen 4K TVs at the store and they are so smooth and detailed it's crazy. I don't care about all the crap you guys are talking about. I can tell the difference; I don't care if it's frame rate or smoothness or what. I can tell.

     

    If I had the money, I'd consider buying a 4K monitor.  For sure.  Assuming I had enough money to rebuild my PC.  I think it may cost a little too much right now.  I'd likely need a second GPU to even think about running games as nicely as I do now at 1920x1080 or whatever it is.  Max graphics.

  • DAS1337 Member UncommonPosts: 2,610
    Originally posted by Axehilt
    Originally posted by Kiyoris

    Some MMOs use the argument that they support 4K resolutions (in the settings).

    I just don't think it makes any sense.

    I have been looking at commonly sold video cards and the hit they take when they go from 1080p to 4K.

    The amount of FPS you lose by going to 4k is massive.

    It's a 3x to 4x reduction in FPS.

    Considering each new GPU generation increases performance by barely 10%-15%, a 4x performance hit from using 4K is hard to justify.

    Up to 4 times lower FPS by going from 1080P to 4K.

    It makes no sense that:

    1. People would put up with a 4x drop in framerates

    2. People would justify the extra cost in power consumption / hardware price for such horrible gains when they could be playing at 4 times the FPS for cheaper

    Sure, we went from 640x480 to 800x600 to 1024x768 and then to 1080p, but those changes resulted in easily verifiable gains in graphics quality; you can easily SEE the difference between 800x600 and 1080p.

    However, this jump from 1080p to 4K is barely visible and causes a massive drop in performance, so it makes no sense to me.

    What exactly are you trying to say?  That this will never be adopted?  That's crazy.  We've watched the slow march up from early resolutions to the current norms, and that march will continue until the improvements stop being discernible to the human eye.

    As others have said, the cutting edge isn't for average people with average machines.  It's for the 1% with the fastest machines.

    No, clearly he's saying that right now the improvements you'd get are not worth the cost it would take to get there. I would have to agree. Most people here do not fall into that 1% either. And it does not make you cool to lie about it. No one cares.

     

    I don't really get what you are attempting to argue or why you decided to quote his post. 

  • Leon1e Member UncommonPosts: 791
    Originally posted by dave6660
    Originally posted by Robokapp
    Originally posted by rojoArcueid
    Originally posted by NomadMorlock

    My understanding is that anything over 30 FPS cannot be seen by the naked eye, therefore 4x the resolution allows the game to be more beautiful and detailed (significantly) as long as the FPS never drops below 30.

     

    Wouldn't you want to have 4x the resolution instead of 200 FPS that looks the same as 30FPS?

    You can't see the difference between 30 FPS and higher FPS?

    I personally don't care about 4K; as long as I get 60 FPS in my games I'm happy. At the moment I don't get that with many games on my old PC, but damn, the difference is so noticeable it's not even funny. I'm really sick of some devs jumping on the "cinematic" 30 FPS bandwagon. If higher resolution limits the frame rate to 30 due to a lack of stronger tech, then I'll take 1080p at 60 FPS over 4K any day.

    Can you tell 60 FPS from 120 FPS?

     

    I can't.

    For very fast-moving objects you might be able to tell the difference - the jump between successive positions starts to show. For the most part I cannot tell the difference either.

    Evidently you can't see it because you have a 60Hz monitor. It simply *can't* display more than 60 frames per second. *Physically impossible*. And with VSync, rendering at 59 FPS throttles you down to 30 FPS, so you still don't see the whole thing. Thus it's quite hard to notice the difference. A 120Hz screen is a whole different story. G-Sync/Adaptive Sync is for you 60Hz mid-range GPU guys.
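
    The 59-FPS-drops-to-30 behaviour mentioned above is how classic double-buffered VSync works: a finished frame has to wait for the next vertical blank, so missing one 60Hz interval even slightly snaps the delivered rate down to the next divisor. A simplified sketch (it ignores triple buffering and adaptive sync, which exist precisely to avoid this cliff):

    ```python
    import math

    # Delivered frame rate under classic double-buffered VSync on a 60 Hz display.
    REFRESH_HZ = 60
    refresh_interval_ms = 1000 / REFRESH_HZ

    for render_fps in (75, 60, 59, 45, 30):
        render_time_ms = 1000 / render_fps
        intervals_waited = max(1, math.ceil(render_time_ms / refresh_interval_ms))
        displayed_fps = REFRESH_HZ / intervals_waited
        print(f"GPU renders at {render_fps:3d} FPS -> displayed at {displayed_fps:.0f} FPS")

    # GPU renders at  75 FPS -> displayed at 60 FPS
    # GPU renders at  60 FPS -> displayed at 60 FPS
    # GPU renders at  59 FPS -> displayed at 30 FPS
    # GPU renders at  45 FPS -> displayed at 30 FPS
    # GPU renders at  30 FPS -> displayed at 30 FPS
    ```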

  • Nanfoodle Member LegendaryPosts: 10,875
    I remember many stages of graphics being out of range of the average gamer. 4K is as likely to become the new standard as anything that came before it. 1080p is only now getting a firm hold in the average home, so it's gonna be a while IMO, but it will come.
  • Axehilt Member RarePosts: 10,504
    Originally posted by DAS1337

    No, clearly he's saying that right now the improvements you'd get are not worth the cost it would take to get there. I would have to agree. Most people here do not fall into that 1% either. And it does not make you cool to lie about it. No one cares.

    I don't really get what you are attempting to argue or why you decided to quote his post. 

    I quoted his post because I was replying to it. The easiest way to know someone is replying to you is to search for your name in the thread; because the name pops up in the quote, you can easily locate replies.

    My post was like 5 sentences long.  If you couldn't understand what I was trying to say, it wasn't for lack of clarity on my part.  Obviously I'm asking why he would even bring up the topic, since clearly eventually players are going to use 4k resolutions in MMO gaming.

    "What is truly revealing is his implication that believing something to be true is the same as it being true. [continue]" -John Oliver

  • krea Member UncommonPosts: 237
    I bought myself a 4K TV a while back, I think it's 50 inches, though even on that it's hard to see the difference. I think 4K shows best on very big screens, and even the 50-inch is too small to really justify it, at least for me. So for gaming I don't see why it's needed yet, since PC screens are much smaller than a 50-inch TV, so the difference is even harder to spot there.
  • madazz Member RarePosts: 2,115

    What a very strange thread. All gaming... ALL of it will be 4K someday, and even higher. All MMOs will be played at a standard 4K too. I don't understand how someone uses a computer and doesn't notice that things ADVANCE AND UPGRADE OVER THE YEARS.

     

    I've been playing quite a few games in 4K lately. The one I will be playing today is ESO. Sure, it's not in the menu, but I found out how to do it from a handy Reddit thread.

     

    I'm sorry guys, but I see no reason to even continue this thread. This is the same as someone saying "I think almost no one will play Doom at 720p." It's utterly ridiculous. I wonder what the OP thought when 640x480 was the standard.

  • Robsolf Member RarePosts: 4,607

    If you finished that post title with "for a couple of years at least", I'd agree with you. It will probably be at least that long before hardware can put out acceptable frame rates at prices most people are willing to pay. Once pixel density reaches the point where individual pixels can't be distinguished by the naked eye, I suspect that pixel density advancement will stop, or slow down dramatically. We probably have at least 7-10 years before that's the case.

    In the meantime, there are tons of other improvements that are just as important, if not more.  Brightness, contrast, vividness, color depth and accuracy.  We're still a long way off in those departments.

    As to the audio argument, there is some validity there. But it's really more of a recording/playback hardware issue than anything else. Recording hardware such as mics and preamps/mixers is still far from accurately capturing the actual dynamic range of live instruments. This is fine, considering that consumer-level speakers and amps are far from being able to accurately play it back. Until we have the ability to record live audio and have it be identical to hearing it live, we'll have no idea whether 16-bit/44.1 kHz, let alone MP3, will remain an acceptable playback format.
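
    Since the 16-bit/44.1 kHz figure comes up here, the two numbers usually attached to that format are its theoretical dynamic range and its Nyquist limit. A minimal sketch of that textbook arithmetic (it says nothing about whether recording or playback gear actually reaches those limits, which is the point being made above):

    ```python
    import math

    # Theoretical limits of the 16-bit / 44.1 kHz ("CD quality") format.
    bit_depth = 16
    sample_rate_hz = 44_100

    dynamic_range_db = 20 * math.log10(2 ** bit_depth)  # quantization-limited dynamic range
    nyquist_hz = sample_rate_hz / 2                      # highest representable frequency

    print(f"{bit_depth}-bit dynamic range: ~{dynamic_range_db:.0f} dB")  # ~96 dB
    print(f"Nyquist limit at {sample_rate_hz} Hz: {nyquist_hz:.0f} Hz")  # 22050 Hz
    ```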

    And this brings me around to what I feel is the most important point:  The reference baseline is all wrong in this topic.  The point isn't "monitor tech advancement ends when the previous generation looks just as good as the new generation".  Video Tech advancement ends when you look at video or an image of a person/place/thing on the monitor and it looks just like it does when you see it in real life.

    And we are a long, long, LONG way from that.

  • madazz Member RarePosts: 2,115
    Originally posted by krea
    I bought myself a 4K TV a while back, I think it's 50 inches, though even on that it's hard to see the difference. I think 4K shows best on very big screens, and even the 50-inch is too small to really justify it, at least for me. So for gaming I don't see why it's needed yet, since PC screens are much smaller than a 50-inch TV, so the difference is even harder to spot there.

    It's not about the size of the TV, it's about your viewing distance. With a 4K TV you can sit closer and see more definition. If you sit too far back you may as well have a standard 1080p TV. Also, I doubt you are even viewing 4K content on your 4K TV. Do you even have the right HDMI cable?

  • Utinni Member EpicPosts: 2,209
    People said this about HD too. My dad still says he "can't tell the difference" between an old-school 480p CRT television and a new set. I notice a huge difference in quality playing WoW in 4K over 1080p. In a few years they won't even make 1080p stuff.
  • krea Member UncommonPosts: 237
    Originally posted by madazz
    Originally posted by krea
    I bought myself a 4K TV a while back, I think it's 50 inches, though even on that it's hard to see the difference. I think 4K shows best on very big screens, and even the 50-inch is too small to really justify it, at least for me. So for gaming I don't see why it's needed yet, since PC screens are much smaller than a 50-inch TV, so the difference is even harder to spot there.

    It's not about the size of the TV, it's about your viewing distance. With a 4K TV you can sit closer and see more definition. If you sit too far back you may as well have a standard 1080p TV. Also, I doubt you are even viewing 4K content on your 4K TV. Do you even have the right HDMI cable?

    Yes, I have the proper cable, though 4K also works over a normal HDMI cable; I think it's limited to 30 FPS if I'm correct. And I'm watching 4K mostly on Netflix since, well, there isn't that much content out yet :)
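
    The 30 FPS limit is right for older HDMI 1.4 cables and ports, and it comes down to link bandwidth: HDMI 1.4 carries roughly 8.2 Gbit/s of video data, HDMI 2.0 roughly 14.4 Gbit/s. A rough back-of-the-envelope sketch, assuming uncompressed 8-bit RGB and ignoring blanking overhead (real requirements are somewhat higher):

    ```python
    # Approximate uncompressed video bandwidth needed for 4K at different refresh rates.
    def video_bitrate_gbps(width, height, fps, bits_per_pixel=24):
        return width * height * fps * bits_per_pixel / 1e9

    for fps in (30, 60):
        rate = video_bitrate_gbps(3840, 2160, fps)
        print(f"4K @ {fps} Hz needs roughly {rate:.1f} Gbit/s of pixel data")

    # 4K @ 30 Hz needs roughly 6.0 Gbit/s of pixel data  -> fits within HDMI 1.4
    # 4K @ 60 Hz needs roughly 11.9 Gbit/s of pixel data -> needs HDMI 2.0
    ```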

  • Reizla Member RarePosts: 4,092
    Originally posted by DAS1337

    I don't know anything about all that technical BS. What I do know is that I could tell a massive difference between my 60Hz and 120Hz TVs. I could tell the difference when I bought my 144Hz monitor too, though not as big a difference, I believe due to the size. So I can tell a difference. A friend of mine has a 240Hz TV and it's like watching a real person standing in front of you. I've seen 4K TVs at the store and they are so smooth and detailed it's crazy. I don't care about all the crap you guys are talking about. I can tell the difference; I don't care if it's frame rate or smoothness or what. I can tell.

    I think your statement here kind of sums up what a lot of people here say about seeing a maximum number of FPS.

    TVs are a lot bigger than monitors. On my TV (30", 1080i) I recently switched from SD to HD satellite, and boy did I see the difference. The same goes for moving from the old Sony Trinitron CRT TV to the current 30" TV we have now - that was a huge difference as well.

    Your statement shows just that: the size of the TV (4K starts at what, 50"?) is what makes you see the difference. On a screen that big, 60Hz versus 120Hz shows up in how smoothly the picture is redrawn every second. That same 60Hz vs 120Hz difference on my TV would not be such a big deal and I most likely wouldn't notice it (and a doctor said I have very good eyes - I could read those tiny 1.5 letters on the test board).

    The same goes for resolution. The larger the TV (or monitor), the higher the resolution should be. I would not want to watch 1080p on a 50" screen for sure, unless I'm at least 8 meters away; getting too close to the TV would just reveal the pixels on it. Add to that, I still don't get why people in small rooms (3-4 meters) have TVs that are over 40" - it's just a drop in quality...

    And from big-ass TVs to monitors... I know a lot of people want higher resolutions for their games. 4K, 8K, 16K - what's the limit? Perhaps what I stated above: how far you sit from your monitor. My monitor (22") is a bit over an arm's length away from my face, which means there's no need to go above 1080p. If you want to use 4K, you need 27" minimum IIRC. Because of the bigger screen you then need to sit a bit further from the monitor (1.5 meters or so?). At that extra distance I doubt you'd see much of the extra detail a 4K monitor gives over a 1080p one (you will see some of the difference, but not all of it).
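
    To put very rough numbers on that distance argument, here is a sketch that estimates pixels per degree of vision for a few of the setups mentioned. The ~60 pixels-per-degree figure is a common rule of thumb for 20/20 acuity, and the sizes and distances are just the examples from the post, so treat the output as ballpark only.

    ```python
    import math

    # Pixels per degree of horizontal vision for a 16:9 screen at a given distance.
    def pixels_per_degree(diagonal_in, h_res, v_res, distance_m):
        aspect = h_res / v_res
        width_m = (diagonal_in * 0.0254) * aspect / math.sqrt(1 + aspect ** 2)
        h_fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
        return h_res / h_fov_deg

    setups = [
        ('22" 1080p at arm\'s length (0.7 m)', 22, 1920, 1080, 0.7),
        ('27" 4K at 0.7 m',                    27, 3840, 2160, 0.7),
        ('27" 4K at 1.5 m',                    27, 3840, 2160, 1.5),
    ]

    for label, diag, w, h, dist in setups:
        print(f"{label:36s} ~{pixels_per_degree(diag, w, h, dist):3.0f} px/degree")

    # 22" 1080p at arm's length (0.7 m)    ~ 50 px/degree  (pixels still faintly visible)
    # 27" 4K at 0.7 m                      ~ 83 px/degree  (past the ~60 px/degree rule of thumb)
    # 27" 4K at 1.5 m                      ~170 px/degree  (far past it)
    ```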

    Last but not least, as you can see in my siggy, I have 3 monitors (5760x1080). It's awesome to play games on from time to time, but once again that's a lot of ground to cover with my eyes, and at the ~80 cm I sit from the monitors it's not as cool as you might think. To fully enjoy the 3-monitor configuration in games I'd need to sit about 2 meters from the monitors to take it all in. Good thing I'm not using them for gaming but for programming instead :p

  • Incomparable Member UncommonPosts: 1,138

    Detail appears smoother with smaller pixels. However slight that benefit may be, it is more evident on a larger screen.

    I think people who play at 4K normally use multiple monitors as well. It's not 4K on just one monitor, and it's not 4K per monitor... I think that would require an uber computer, beyond just stretching what a current well-priced gaming PC with a 980 would cost and handle.

    And it seems people complain about 30 FPS, but with G-Sync or FreeSync those things become non-issues. And if it's above 30 FPS it should be a non-issue.

    But tbh, video stutter bothers me a lot... so I'm not going to encourage barely hitting 30 FPS with a budget gaming PC while overspending on multiple monitors.

    “Write bad things that are done to you in sand, but write the good things that happen to you on a piece of marble”

  • raazman Member UncommonPosts: 35

    The people who say there is no difference above 30 FPS have no clue what they are talking about. I bet most of you have never had the opportunity to see it in person. It's night and day. If you can't tell the difference, see an eye doctor. And 4K gaming is very, very possible. Why did the OP only use those cards? Yes, it's still in the early stages, but in the next few years it should be very achievable on mid-tier cards. Even now I can play at 4K. It is possible.

    It seems as if there is so much misinformation going around, especially from people who have probably never seen it in person.

  • waynejr2 Member EpicPosts: 7,771
    Originally posted by Nanfoodle
    I remember many stages of graphics being out of range of the average gamer. 4K is as likely to become the new standard as anything that came before it. 1080p is only now getting a firm hold in the average home, so it's gonna be a while IMO, but it will come.

    People said they wouldn't need graphics cards back in the day; now try to imagine gaming without one. 4K is the future, at least until 8K replaces it.

    http://www.youhaventlived.com/qblog/2010/QBlog190810A.html  

    Epic Music:   https://www.youtube.com/watch?v=vAigCvelkhQ&list=PLo9FRw1AkDuQLEz7Gvvaz3ideB2NpFtT1

    https://archive.org/details/softwarelibrary_msdos?&sort=-downloads&page=1

    Kyleran:  "Now there's the real trick, learning to accept and enjoy a game for what it offers rather than pass on what might be a great playing experience because it lacks a few features you prefer."

    John Henry Newman: "A man would do nothing if he waited until he could do it so well that no one could find fault."

    FreddyNoNose:  "A good game needs no defense; a bad game has no defense." "Easily digested content is just as easily forgotten."

    LacedOpium: "So the question that begs to be asked is, if you are not interested in the game mechanics that define the MMORPG genre, then why are you playing an MMORPG?"




  • crasset15 Member UncommonPosts: 194

    After some nitpicking in ArcheAge and AC: Revelations at 1080p, the only thing that visibly bothers me (and only after I specifically went looking for it) is the way distant objects are rendered. Trees, for example: if they are very detailed, like a palm tree with many leaves, and the game is trying to render every single leaf from 100 meters away, they look "glittery" and jagged.

    Also, a distant ship that was no larger than a forum avatar here had its mast ropes only partially rendered, with gaps in between where there was no rope color.

    Couldn't see any difference in a nearby house wall, looking at it at a shallow angle in 720p or 1080p.

    But these are just little nitpickings. They are usually things that I don't pay attention to because they are in the corner of my eye. They create context, but I'm not directly paying attention to them when I'm focusing on whatever task I'm doing in the game. Sure if I stop and admire the scenery, I notice them, but they don't bother me when I'm actually playing.

  • Originally posted by NomadMorlock
    Originally posted by Yaevindusk
    Originally posted by NomadMorlock

    My understanding is that anything over 30 FPS cannot be seen by the naked eye, therefore 4x the resolution allows the game to be more beautiful and detailed (significantly) as long as the FPS never drops below 30.

     

    Wouldn't you want to have 4x the resolution instead of 200 FPS that looks the same as 30FPS?

     

    The human eye does not see in frames per second.

    Wow... not too bright there.

     

    The human eye can see the difference in smoothness in any game running below 30 FPS. Once you are over that threshold, it appears the same regardless of how many more frames per second are rendered.

    One of the oldest trolls in the troll book. Please stop feeding it.

  • Dillig Member UncommonPosts: 123

    I am currently playing games on a 39" 4K TV with a GTX 970 video card, and the MMOs I've been playing look absolutely fantastic.

     

    I get about 45 to 55 FPS as well, so let me just say I have no plans to go back to 1080p.

  • Quirhid Member UncommonPosts: 6,230
    4K gaming in 3D at 100 to 120 FPS for each eye. Oh baby... that's something I want to experience.

    I skate to where the puck is going to be, not where it has been -Wayne Gretzky

  • Kaladin Member Posts: 468
    As long as I can play my SNES on a 4K TV, I'll be content.

    I can fly higher than an aeroplane.
    And I have the voice of a thousand hurricanes.
    Hurt - Wars
