Why do games ask for a screen resolution instead of automatically fitting the window size?


32

It seems to me that implementing a flexible, responsive UI layout over a 3D or 2D scene would be more logical, reusable and user-friendly, since it could then run at any screen resolution.

Some modern games auto-detect the screen resolution and adjust the game accordingly, yet the option to change the resolution still remains in the settings.

Why does this option exist?

My guess is that this is one way to let older machines improve performance by rendering fewer pixels and just stretching them across the screen, but surely there are better ways of improving performance (choosing different texture and model qualities, for example).


1
In addition to what has been said here, I think some games pick one of the lowest resolutions as a "safe" choice: almost every hardware configuration will support it, so the game starts without crashing, and the user can then change the resolution in-game.
Microvirus

@Microvirus I second that; starting with a safe default is a good plan for highly variable form factors.
মান

4
" আমার কাছে মনে হয়েছে যে 3 ডি বা 2 ডি স্ক্রিনের উপর নমনীয়, প্রতিক্রিয়াশীল ইউআই লেআউটটি প্রয়োগ করা আরও যুক্তিযুক্ত, পুনরায় ব্যবহারযোগ্য এবং ব্যবহারকারী-বান্ধব হবে " - এটি আপনার কাছে মনে হয়, তবে বাস্তবে এটি না করা প্রায়শই সহজ এবং সস্তা।
Superbest

@Superbest: As a counterargument, World of Warcraft switched to a "fixed-size logical UI" (of sorts). The UI is 768 "scaled pixels" high, with an aspect-appropriate width: wowwiki.com/UI_coordinates (usually 1024x768, 960x768, or 1229x768).
Mooing Duck

Perhaps a responsive UI layout is fine where a game does not require high actions per minute. I know some WarCraft and StarCraft players prefer a fixed layout so they can take advantage of their muscle memory.
Sun

Answers:


51

This lets the user choose the game's quality versus its performance. Some prefer higher-quality graphics settings at a lower resolution, others the opposite. Some computers can handle maximum settings on everything, some can't.

Devices with homogeneous hardware (PlayStation, Xbox, iPhone...) usually don't offer graphics settings for this reason.


4
Worth noting that the performance of some games depends heavily on screen resolution. Try lowering your game's resolution if it's under-performing to see whether that's the case.
user1306322

1
Also worth noting that lowering resolution will address performance bottlenecks occurring as a result of inadequate GPU fill rate and shader throughput. Shader quality is not always linked to resolution but for ambient occlusion, anti-aliasing and other edge detection, it may well be!
Gusdor

29

There is no way to completely reliably detect the correct screen resolution.

One approach is to simply use the user's desktop resolution. This is annoying, as I know a number of people (some with visual impairments) who prefer to run their desktop at a lower resolution to make things look larger, but still prefer games at the native resolution, where small text and details are rarer and less critical.

Another approach is to consult the monitor's supported modes list. This is similarly unreliable: Some monitors provide no mode list or provide it incorrectly. Some provide resolutions higher than the native one, which they are able to hideously down-sample while taking a performance hit...
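To illustrate, here is a minimal sketch of those two detection paths (Python, using pygame purely as an example library); the point is that either value can only ever serve as a default:

    import pygame

    pygame.init()

    # Path 1: the desktop resolution. Usually the panel's native mode,
    # but not if the user deliberately runs their desktop scaled down.
    info = pygame.display.Info()
    desktop = (info.current_w, info.current_h)

    # Path 2: the reported fullscreen mode list. It may be empty, wrong,
    # or include modes the monitor can only reach by down-sampling.
    modes = pygame.display.list_modes()  # -1 means "any size is accepted"

    print("Desktop reports:", desktop)
    print("Mode list reports:", "anything" if modes == -1 else modes)

    # Treat the detected value as a starting guess, never a final answer;
    # the settings menu still has to let the player override it.
    screen = pygame.display.set_mode(desktop, pygame.FULLSCREEN)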

In a way, your question applies to any setting a game provides. "Why should I ask the user when I can guess what's best for them?" Why would I ask them about quality settings when I can detect their hardware? Why would I ask about FOV if I know what the game looks best with? Why should I let users turn off AA if their computers can handle it? This more general question highlights the real answer, which is twofold:

  • Your guesses will sometimes be wrong, because you won't always have accurate enough hardware information to guess well. What if a user's graphics card is bugged, causing the computer to crash at a certain resolution, and you don't let them disable it? Your game is now useless to them.
  • Users like choice. Maybe someone prefers the blurry look of playing at a non-native resolution, even though it's "wrong". Maybe they have a CRT monitor that supports many resolutions and the best one isn't the highest, or they get a better refresh rate at a lower one.

In effect, you cannot accurately judge what the user wants based on information from the computer, nor can you accurately judge what will work best. It is best left to the user (who knows their own computer) to decide.

(Note I've only covered PC here. Consoles are a different story, but have already been covered by others.)


17
+1; this particularly annoys me in games which do not give me a gamma slider (or at least a brightness setting). It's great that you found a look that you think makes your game look best on your monitors, but your monitor is not my monitor, and I @%&$ing hate dark environments where I can't see. I don't care if the black-on-black is "the point" of your stealth level, I want to see what I'm doing.
Brian S

1
"what if a users graphics card is bugged causing the computer to crash at a certain resolution and you will not let them disable it" ah, the good old days with D2 and the barbarian's battle cry that always crashed my computer :)
person27

You can pretty reliably detect the native resolution; the desktop resolution is a good indicator. Virtually everyone runs their desktop at native res, and it is at least a good starting point. However, that doesn't mean you shouldn't offer the option to change it. For my current project, full screen will only work at the native (desktop) resolution; if people want to lower it for whatever reason, they can either go windowed, or the game will change the back render target size and just let the graphics card stretch it.
Programmdude

2
I had a recent experience beta-testing a game which I will not name as it's still unreleased, which in lieu of proper graphics settings auto-detected the user's screen resolution. It ended up so badly that it rendered the game completely unplayable for a lot of testers, and the workaround that the devs suggested was to change the monitor's screen resolution or refresh rate, and even that didn't work for everyone. Thankfully, this auto-detection is only part of the beta and the final game is presumed to have screen resolution options.
BoltClock

@BoltClock I understand that feeling entirely. My own monitor (a fairly high-end Dell UltraSharp) reports a bogus list of modes; if software just picks the highest one and uses it, I either get weird graphics artifacts or it won't display at all.
Vality

7

There are a VAST number of reasons to allow the user to control the settings for their game.

  • MANY people have 2 (or more) monitors these days. The user should be able to determine which one to play the game on.
  • There are thousands of different devices a user could be using, and no way to reliably tell what setting would be optimal for every one.
  • Some users may prefer less than "optimal" settings for reasons ranging from better game performance, to poor eyesight (lower resolution = bigger text!).
  • Some users may prefer to play the game windowed, so they can have their chat boxes, walkthroughs, music player, or other programs visible at the same time as the game.

That said, many games DO auto-detect, and then set the initial settings to whatever they think is the best your machine can handle.
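One common shape for that auto-detect-then-let-the-user-override pattern is roughly the following (a hypothetical sketch in Python; the file name and setting keys are made up for illustration):

    import json
    import os

    SETTINGS_FILE = "settings.json"  # hypothetical location

    def detected_defaults():
        # Stand-in for real detection (desktop resolution, VRAM size, ...).
        return {"resolution": [1920, 1080], "quality": "high", "monitor": 0}

    def load_settings():
        settings = detected_defaults()           # best guess on first run
        if os.path.exists(SETTINGS_FILE):
            with open(SETTINGS_FILE) as f:
                settings.update(json.load(f))    # saved user choices always win
        return settings

    def save_settings(settings):
        with open(SETTINGS_FILE, "w") as f:
            json.dump(settings, f, indent=2)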


1
Excluding windowed mode, are there any games that let you choose which monitor to run on? I've been multi-monitor for a number of years and don't ever recall a full-screen game giving me the option to run on something other than my primary display.
Dan Neely

I can't recall specifics, but I've had at least a few games take over my secondary monitor, and I've occasionally had games change my monitor's settings ... If I've set my monitor's resolution to something other than the default ... well, there's probably a reason.
aslum

2
I've seen quite a few games with this as an option, usually called 'display adaptor' in the menu. EVE Online is one example.
Ben

5

That's because the cost and effect of texture quality, geometry detail and screen resolution are very hardware-dependent.

Texture quality usually does not have much impact on the speed of the rendering pipeline, as long as all textures are read from GPU memory. When not all textures fit into GPU memory, they need to be read from normal RAM, or even worse from the hard drive cache, which affects performance negatively. In that case, reducing geometry* and omitting expensive effects** won't help much. But when the execution speed of the rendering pipeline is the bottleneck, reducing texture resolution won't help much either.
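As a rough illustration of why texture quality is mostly a memory budget question, consider the footprint of uncompressed RGBA textures with full mip chains (a back-of-the-envelope sketch in Python; real engines use compressed formats, so treat these as upper bounds):

    def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
        total = width * height * bytes_per_pixel
        if mipmaps:
            total = total * 4 // 3   # a full mip chain adds roughly one third
        return total

    for size in (512, 1024, 2048, 4096):
        mib = texture_bytes(size, size) / (1024 ** 2)
        print(f"{size}x{size}: ~{mib:.1f} MiB")

    # Each quality step quadruples the footprint; once the working set no
    # longer fits in GPU memory, performance drops regardless of resolution.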

Vertex shaders are usually unaffected by the output resolution. The only way to reduce the load on them is to reduce quality and quantity of the 3d models in the scene.

Pixel shaders, on the other hand, scale linearly with the number of pixels on the screen, so reducing the screen resolution remains an important tool to improve performance. Halving both the horizontal and the vertical resolution leaves you with just a quarter of the pixel shader invocations.
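A quick worked example of that scaling (plain Python; assuming one fragment shader invocation per covered pixel, ignoring overdraw and MSAA):

    base = 1920 * 1080
    for w, h in ((3840, 2160), (1920, 1080), (1366, 768), (960, 540)):
        pixels = w * h
        print(f"{w}x{h}: {pixels:>9,} pixels, {pixels / base:.2f}x the fragment work of 1080p")

    # Halving both dimensions (1920x1080 -> 960x540) leaves exactly one
    # quarter of the pixel shader invocations per frame.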

In contrast to older CRT displays, modern LCD or plasma screens have a native pixel resolution. When they are fed a video stream at a different resolution, they need to interpolate. Some interpolate much better than others, which means that running them at a lower resolution doesn't reduce the picture quality a lot, while other monitors look really bad when not run at their native resolution (the first LCD screen I owned used nearest-neighbor interpolation, which looked horrible; with my current screens, it's hard to tell when they aren't running at the correct resolution).

The game engine cannot know how well the user's monitor interpolates, so it's better to leave the choice between reducing texture and geometry detail and reducing the screen resolution to the user.

*) OK, reducing geometry might help a bit because vertices also consume GPU memory.

**) unless, of course, omitting these effects means that some textures are no longer required


1
Just to nitpick: CRTs have a "dot pitch" which means that they really DO have a native resolution. The phosphors in a CRT have a decay time which means a CRT has an ideal refresh rate as well.
Zan Lynx

@ZanLynx But as far as I can tell, it was never possible to get logical scanlines to align perfectly with CRTs' physical shadow-mask elements the way one can with an LCD.
Damian Yerrick

2

Not only do predefined resolution settings make it possible to trade quality for performance, but a universal approach that fits all kinds of resolutions, aspect ratios and DPIs is also much harder to build.

If you offer only a few fixed resolutions, it is the user's job to choose the one that looks best to them (especially when their resolution or aspect ratio is unusual). If you make one universal design, you are responsible for making the game look right on all devices (which can be especially hard on mobile nowadays; not so much on iOS, but on other platforms with a huge variety of devices).
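For comparison, the "fixed-height logical UI" compromise mentioned in the comments under the question (WoW's 768 "scaled pixels") is cheap to implement; a minimal sketch in Python, with the 768 figure taken from that comment:

    # Map a fixed-height logical UI onto any physical resolution.
    # UI code only deals in logical coordinates; one scale factor
    # handles every screen.
    LOGICAL_HEIGHT = 768

    def ui_space(screen_w, screen_h):
        scale = screen_h / LOGICAL_HEIGHT    # physical pixels per logical pixel
        logical_w = screen_w / scale         # width follows the aspect ratio
        return logical_w, LOGICAL_HEIGHT, scale

    for res in ((1024, 768), (1920, 1080), (3440, 1440)):
        w, h, s = ui_space(*res)
        print(f"{res[0]}x{res[1]} -> logical canvas {w:.0f}x{h}, scale {s:.2f}")

    # A widget anchored at logical (logical_w - 200, 20) stays in the
    # top-right corner on every monitor, without per-resolution layouts.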


2

Additional reason: Changing environments.

1080p is now standard, but wasn't 5 years ago. 4k is entering the picture, and how long until 6k/8k/1080k (or whatever) comes along? What about gaming on an 8-year-old computer that tops out at 1280x720? 1024x768? 640x400?

My monitor supports 4k natively, but it'd choke on the bandwidth required and max out at 30fps. Let me switch to 1080p (or lower) so that the game runs at 60fps and the text/HUD/etc. isn't microscopic.
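That bandwidth ceiling is easy to estimate with back-of-the-envelope numbers (plain Python; uncompressed 24-bit colour assumed, blanking intervals ignored):

    def gbit_per_s(width, height, refresh_hz, bits_per_pixel=24):
        return width * height * bits_per_pixel * refresh_hz / 1e9

    for w, h, hz in ((3840, 2160, 60), (3840, 2160, 30), (1920, 1080, 60)):
        print(f"{w}x{h}@{hz}Hz: ~{gbit_per_s(w, h, hz):.1f} Gbit/s")

    # Roughly 12 Gbit/s for 4k60 is more than older display links can
    # carry, which is why a 4k monitor may top out at 30 Hz.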

(Additionally: Give me windowed / maximized windowed as well please so I can tab around easily lol)


3
1080k... That's 1 million times the amount of pixels in a 1080p display. That's 2 trillion pixels, would require at least 8TB of VRAM and requires around 4000 terabits/s to drive. I would really like to see such a display, and the amount of power it would consume...
Panda Pajama

0

"My guess is that this is one way to let older machines improve performance by rendering fewer pixels and just stretching them across the screen, but surely there are better ways of improving performance (choosing different texture and model qualities, for example)."

That's not so simple. A game can run slow on my machine because it's stuck on fragment processing (perhaps lots of particle effects like smoke). Reducing model quality wouldn't take much load off there, and neither would reducing texture quality. On the other hand, decreasing the resolution reduces that load dramatically (a switch from 1920x1080 to 1366x768 cuts the number of pixels to process roughly in half).

And if my screen happens to be big - like a TV (as in my current setup - I pretend my PC is a bigger Xbox) - then, at a given viewing distance, a higher resolution doesn't gain me much. So I have a good reason to turn the resolution down and perhaps use the extra processing power for more detailed lighting or shadows.

Licensed under cc by-sa 3.0 with attribution required.