Monitor's 1920x1080 vs. native resolution
I've got a Dell SE198WFP 19" monitor with a native resolution of 1440x900, but my graphics card (an AMD Radeon 5450) supports 1920x1080. Can this monitor display 1920x1080, and would it look better than at 1440x900?
-
Re: Monitor's 1920x1080 vs. native resolution
No, 1440x900 is the max resolution supported by that monitor.
Some monitors do let you set a resolution above their native one, but it results in a panning/scrolling display.
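Just to put rough numbers on why a 1920x1080 signal can't add detail on this panel, here's a quick back-of-the-envelope sketch in Python (nothing specific to this monitor's scaler, just the pixel counts):

# How many source pixels each physical pixel would have to represent
# if a 1920x1080 signal were squeezed onto the 1440x900 panel.
native_w, native_h = 1440, 900
signal_w, signal_h = 1920, 1080

scale_x = signal_w / native_w   # ~1.33 source pixels per physical pixel horizontally
scale_y = signal_h / native_h   # 1.2 vertically

print(f"{scale_x:.2f} x {scale_y:.2f} source pixels per physical pixel")
print(f"{signal_w * signal_h - native_w * native_h} pixels of detail discarded per frame")
# -> 1.33 x 1.20 source pixels per physical pixel, 777600 pixels discarded
# The panel is also 16:10 while 1920x1080 is 16:9, so the picture would be
# stretched or letterboxed on top of being blurred.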
-
Re: Monitor's 1920x1080 vs. native resolution
^
Yes. The native resolution will always give the sharpest picture. Most of the time, the image on an LCD monitor will look slightly fuzzy if it is not set to the native resolution.
-
Re: Monitor's 1920x1080 vs. native resolution
Originally posted by c_hegge:
Yes. The native resolution will always give the sharpest picture. Most of the time, the image on an LCD monitor will look slightly fuzzy if it is not set to the native resolution.
Also, what is vsync? I don't see it in the AMD Catalyst Control Center.
-
Re: Monitor's 1920x1080 vs. native resolution
Don't bother with it. Simply put, it only matters in some games, and most of the time it's best to leave it at the default setting (off).
Set the monitor to the panel's native resolution (1440x900 in your case) and set the refresh rate to 60 Hz. LCD monitors don't really have refresh rates in the same sense as old CRT monitors, and the electronics and panel don't do more than 60 updates a second, so 60 Hz is fine; selecting a higher number won't do any better.
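If you want to double-check what Windows is actually driving the panel at, here's a rough sketch using Python and the standard Win32 GetDeviceCaps call (assuming Windows with Python installed; it's only an illustration, the Screen resolution dialog shows the same thing):

# Read the current desktop mode through the Win32 GetDeviceCaps call.
import ctypes

HORZRES, VERTRES, VREFRESH = 8, 10, 116    # standard GetDeviceCaps indexes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

hdc = user32.GetDC(0)                      # device context for the whole screen
width = gdi32.GetDeviceCaps(hdc, HORZRES)
height = gdi32.GetDeviceCaps(hdc, VERTRES)
refresh = gdi32.GetDeviceCaps(hdc, VREFRESH)
user32.ReleaseDC(0, hdc)

print(f"Desktop is running at {width}x{height} @ {refresh} Hz")
# On this monitor you want to see 1440x900 @ 60 Hz here.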
If text looks too tiny at the native resolution, don't change the resolution to compensate. Instead, right-click the desktop, select Screen resolution, and click "Make text and other items larger or smaller" (in Windows 7). From there you can make text larger or smaller without affecting the rest of the image (windows, movies, applications, etc.).
If you currently use a VGA cable, you should also switch to a digital connection. The DVI input on the monitor will give you better image quality, and the video card should have a DVI output, so get a DVI cable. If the card only has HDMI and VGA, HDMI-to-DVI adapters are very cheap, a few dollars.
If you do this, also check My Digital Flat Panels > Pixel Format in Catalyst Control Center and make sure RGB 4:4:4 Pixel Format PC Standard (Full RGB) is selected. This option only matters when the connection between the monitor and the video card is digital; you may not see it at all over a classic VGA (analogue) cable.
-
Re: Monitor's 1920x1080 vs. native resolution
If you set the refresh rate higher than the panel supports, the monitor will just buffer and discard some of the frames.
You just generate extra heat on the video card for nothing.
That's why I mentioned vsync:
some people run games at 200-300 fps thinking it's better, and all they're doing is stressing the GPU for nothing!
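Rough numbers behind that, as a quick sketch (assuming a plain 60 Hz panel and a game rendering 300 fps, the kind of figure mentioned above):

# GPU work thrown away when a game renders far above a 60 Hz panel's refresh rate.
refresh_hz = 60
rendered_fps = 300

displayed = min(rendered_fps, refresh_hz)
wasted = rendered_fps - displayed

print(f"{displayed} frames/s reach the screen, {wasted} frames/s are discarded")
print(f"{wasted / rendered_fps:.0%} of the GPU's effort produces nothing visible")
# -> 60 frames/s reach the screen, 240 frames/s are discarded (80% wasted)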
-
Re: Monitor's 1920x1080 vs. native resolution
The monitor has both analogue and digital inputs.
Old CRT monitors worked by sweeping an electron beam across the screen, making the phosphor dots on the inside of the glass light up. By the time the beam reached the bottom of the screen, the dots at the top had already started to fade and get darker.
So with CRT monitors, the electron beam had to redraw the whole screen many times each second to maintain brightness across the screen and prevent flicker.
The minimum acceptable number of updates is usually considered to be 60 per second, hence the value of 60 Hz. However, many people's eyes are more sensitive than that, so 75-85 Hz was a very common and recommended refresh rate for CRTs.
LCD displays work differently: there's no electron beam and no fading phosphor, so the brightness doesn't decay, there's no flicker, and there's no concept of a minimum refresh rate. Once the monitor receives a picture, it takes about 3-15 ms to switch the pixels to their new state, so the image changes essentially instantly and then doesn't need to be refreshed constantly.
With LCD displays, that 60 Hz figure refers to the maximum number of opportunities to change the image on the screen. Sixty times a second the monitor receives a new snapshot, and if there are differences between the picture already displayed and the new one, it changes those pixels.
The processor inside the monitor is designed around this value of 60 because it's standardized, so if you select a higher value it will simply ignore the extra information it receives. A larger value selected there won't make any difference.
Vsync stands for vertical synchronization. With CRT monitors, enabling it forced the video card to wait until the electron beam reached the bottom of the screen and moved back to the top before it started working on a new picture.
This prevented an effect called "tearing", which was most noticeable in games when you did a slow 360-degree turn. Here's an extreme example: http://zoneitastuces.com/wp-content/...en-tearing.png
Tearing used to be a real problem, but modern video cards and drivers aren't affected that much, and it's usually better to leave vsync off. Don't mess with it; if some badly optimized game shows tearing, you can enable it from that game's menu or by creating a driver profile for that particular game.
A Radeon 5450 doesn't have much processing power, and you won't save energy by enabling vsync and capping the update rate at 60 fps instead of letting a game run at 100-200 fps, or whatever it manages. If you have a game that runs that fast on a Radeon 5450, you're better off raising the image quality in the game options until the framerate drops to around 60 fps.
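For anyone wondering what vsync actually changes, here's a stripped-down sketch of the two kinds of render loop. It only shows the pacing idea: draw_frame() is a hypothetical placeholder, and real vsync blocks in the driver on the vertical blank rather than sleeping.

# Sketch of an uncapped render loop vs. a vsync-style loop paced to 60 Hz.
import time

REFRESH_INTERVAL = 1.0 / 60          # one 60 Hz screen update = ~16.7 ms

def draw_frame():
    pass                             # placeholder for rendering one frame

def render_uncapped(seconds=0.25):
    """Render as fast as possible; on a 60 Hz panel most frames never reach the screen."""
    frames, start = 0, time.monotonic()
    while time.monotonic() - start < seconds:
        draw_frame()
        frames += 1
    return frames

def render_vsynced(seconds=0.25):
    """Start a new frame only when the next screen update is due."""
    frames, start = 0, time.monotonic()
    while time.monotonic() - start < seconds:
        draw_frame()
        frames += 1
        next_update = start + frames * REFRESH_INTERVAL
        time.sleep(max(0.0, next_update - time.monotonic()))
    return frames

print(render_uncapped(), "frames uncapped vs", render_vsynced(), "frames paced (0.25 s each)")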
-
Re: Monitor's 1920x1080 vs. native resolution
Originally posted by stj:
If you set the refresh rate higher than the panel supports, the monitor will just buffer and discard some of the frames. You just generate extra heat on the video card for nothing. That's why I mentioned vsync: some people run games at 200-300 fps thinking it's better, and all they're doing is stressing the GPU for nothing!
-
Re: Monitor's 1920x1080 vs. native resolution
Well, it's a good thing that you're wrong then.
Consoles don't limit the framerate to 30 fps because they want to; they limit it because there's not enough processing power in the console to render everything and also leave room for AI, sound processing and everything else.
It's also a simple and easy way to cut development costs: why spend money optimizing the code to run at 60 fps when you can just say your game only runs at 30 fps for "cinematic" reasons?
A lot of console games aren't even 1080p or 1080i; they use resolutions like 1440x1080 or 960x1080 and upscale the image sent to the TV in hardware. You can see a list of native resolutions here: http://www.ign.com/wikis/xbox-one/PS...and_Framerates
A higher framerate does make a game better to play, even though you still only get 60 updates per second on the screen unless you go with a monitor capable of 120 or 144 updates a second.
It depends on how the game is built and a lot of other things. For example, lots of multiplayer games have network code that behaves differently depending on the game's framerate (more network traffic, more updates, smoother multiplayer if you have more fps).
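As a rough illustration of what "network code tied to the framerate" means (render() and send_update() are hypothetical placeholders, not any particular engine):

# Sketch of a game loop that sends one network update every couple of rendered
# frames, so a higher framerate directly means more updates per second.
def render():
    pass                  # placeholder: draw one frame

def send_update():
    pass                  # placeholder: send player state to the server

def one_second_of_game(fps, frames_per_net_update=2):
    packets = 0
    for frame in range(fps):          # one second's worth of frames
        render()
        if frame % frames_per_net_update == 0:
            send_update()
            packets += 1
    return packets

print(one_second_of_game(60), "updates/s at 60 fps vs", one_second_of_game(200), "updates/s at 200 fps")
# -> 30 updates/s at 60 fps vs 100 updates/s at 200 fps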
-
Re: Monitor's 1920x1080 vs. native resolution
Originally posted by mariushm:
A higher framerate does make a game better to play, even though you still only get 60 updates per second on the screen unless you go with a monitor capable of 120 or 144 updates a second. It depends on how the game is built and a lot of other things. For example, lots of multiplayer games have network code that behaves differently depending on the game's framerate (more network traffic, more updates, smoother multiplayer if you have more fps).
-
Re: Monitor's 1920x1080 vs. native resolution
Originally posted by stj:
Tearing depends on what you're doing. If you like side-scrolling games like I do, tearing is pretty much constant without vsync. You're less likely to notice it in 3D first-person stuff.