The general approach manufacturers have taken from the start to display the full spectrum of colors has been to break them down. Rather than designing complex pixels capable of displaying a multitude of shades, each pixel is made up of three sub-pixels, each displaying one of the primary colors: red, green, or blue.
When the user is located at a certain distance from the screen, he or she is no longer able to resolve each subpixel, but only the mixture of the three. This makes it possible to reproduce an entire palette of colors from various mixtures of red, green, and blue. All shades of gray can also be generated, from absolute black to bright white, by using all three primary colors in equal amounts.
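The additive mixing described above is easy to try out in code. Here is a minimal sketch (the `mix` helper is just for illustration, not a real graphics API): each color is a (red, green, blue) triple of channel intensities from 0 to 255, and mixing means adding the channels together.

```python
# Additive (RGB) mixing: combine colors by summing each channel, capped at 255.
def mix(*colors):
    """Additively combine (r, g, b) triples, clamping each channel to 255."""
    return tuple(min(255, sum(c[i] for c in colors)) for i in range(3))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

print(mix(RED, GREEN))         # (255, 255, 0)  -> yellow
print(mix(RED, GREEN, BLUE))   # (255, 255, 255) -> white
print((128, 128, 128))         # equal but dimmer amounts -> a mid gray
```

Note how red plus green gives yellow, all three at full strength give white, and equal amounts at any level give a shade of gray.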
Considering red, green, and blue to be primary colors might come as a shock to those who know something about painting, for whom the primary colors are red, yellow, and blue. What we're talking about here are additive primary colors: in the additive color model, red, green, and blue (RGB) are the primaries.
All modern display technologies - CRT, LCD, and plasma - are based on this principle.
'Screen resolution' is not a strange concept, yet a surprising number of people sit at their computers without knowing anything about such basic concepts as 'screen resolution', 'color depth' and 'graphics memory usage'. We will try to change that now! :)
Different screen resolutions and color depths can be adjusted on your computer, but how you do it depends on what computer you use (Windows 95/NT: Start /Settings /Control Panel /Display /Settings), (Amiga: System:Prefs /ScreenMode), (Mac: Apple menu /Control Panels /Monitors)
There is a simple relationship between the amount of graphics memory and the maximum resolution/color depth you can use. So, if you know how much graphics memory your computer has, you can also calculate which resolutions/color depths you should be able to display...
For non-professional users, the most common amounts of graphics memory these days are between 8 and 64 MB.
The most common resolutions are 800*600 and 1024*768.
A resolution of 800*600 means that the viewable area is divided into 800 picture elements ('pixels') horizontally and 600 pixels vertically. The total number of pixels in this case is: 800 times 600 = 480000
How much memory this resolution requires depends on the color depth.
On the computers in use now, there are usually three alternatives when it comes to color depth: 8-bit (256 colors), 16-bit (65,536 colors, a.k.a. HighColor) and 24/32-bit (16.8 million colors, a.k.a. TrueColor).
Computer memory is measured in bytes, e.g. kilobytes (kB) and megabytes (MB). One byte = 8 bits.
This is true for computers nowadays. In the past, there have been computers where one 'byte' was only 4 bits.
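The relationship between color depth and number of colors is simply a power of two: with n bits per pixel, a pixel can hold 2^n distinct values. A quick sketch (the helper name is just for illustration):

```python
# With n bits per pixel, a pixel can take on 2**n distinct values (colors).
def colors_for_depth(bits):
    """Number of distinct colors representable with `bits` bits per pixel."""
    return 2 ** bits

for bits in (8, 16, 24):
    print(f"{bits}-bit: {colors_for_depth(bits):,} colors")
# 8-bit: 256 colors
# 16-bit: 65,536 colors
# 24-bit: 16,777,216 colors
```

This is where the figures 256, "65 thousand" and "16.8 million" come from.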
Now, back to the 8-bit display...
On an 8-bit display (the 8 bits describe the color depth, i.e. the number of colors), every pixel occupies 1 byte in the computer's graphics memory. In the example earlier we had a total of 480000 pixels. This means the total memory usage is 480000 (number of pixels) multiplied by 1 (1 byte per pixel). The answer is still 480000.
Simply speaking, an 800*600 display with 256 colors requires 480000 bytes of graphics memory.
Now, if you wanted more colors, say 16-bit instead, how much memory would that require? Well, 16 bits means 2 bytes, since one byte is 8 bits: 16 bits / 8 bits = 2.
800*600 = 480000 pixels in 16-bit means 480000*2 = 960000 bytes. Quite obviously, a 16-bit display requires twice as much memory as an 8-bit display at the same screen resolution.
The 24-bit display then, quite logically, requires 24 bits / 8 bits = 3 bytes per pixel. That is, every pixel uses 3 bytes, which leads us to the conclusion that an 800*600 24-bit display requires 480000*3 = 1440000 bytes (1440 kB, or roughly 1.4 MB).
So if you happen to have only 1 megabyte of graphics memory on your ancient GFX card, you should be able to display 800*600 in 16-bit but not in 24-bit (16.8 million colors). In real life though, other factors such as your OS (Operating System) may be decisive...
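The whole calculation above can be sketched in a few lines. Here 1 MB is taken to mean 1,048,576 bytes (the binary convention, an assumption; card vendors sometimes count 1,000,000):

```python
# Graphics memory needed for one full frame: width * height * bytes-per-pixel.
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes of graphics memory a display mode requires."""
    return width * height * (bits_per_pixel // 8)

for bpp in (8, 16, 24):
    print(f"800*600 @ {bpp}-bit: {framebuffer_bytes(800, 600, bpp):,} bytes")

# Does each mode fit on a 1 MB card (1,048,576 bytes)?
ONE_MB = 1024 * 1024
print(framebuffer_bytes(800, 600, 16) <= ONE_MB)  # True:  960,000 bytes fit
print(framebuffer_bytes(800, 600, 24) <= ONE_MB)  # False: 1,440,000 bytes don't
```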
Here below I've put together a table with the most common resolutions and how much memory they require at their respective color depths.
Alternatively, you could use this table to see how much memory the display takes out of your total graphics memory. If you e.g. play 3D games that use a lot of textures, you can calculate how much memory you have left for them: [total memory] - [display memory] = [available memory for textures].
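Since every value in such a table is pure arithmetic, you can also just compute it yourself. A sketch (sizes shown in binary megabytes, 1 MB = 1,048,576 bytes, which is an assumption on my part):

```python
# Display memory requirement = width * height * (bits per pixel / 8),
# printed here in binary megabytes for the most common resolutions.
resolutions = [(640, 480), (800, 600), (1024, 768), (1280, 1024), (1600, 1200)]

def display_mb(width, height, bits):
    """Display memory for a mode, in binary megabytes."""
    return width * height * (bits // 8) / 2**20

print(f"{'Resolution':>11} |  8-bit | 16-bit | 24-bit")
for w, h in resolutions:
    row = " | ".join(f"{display_mb(w, h, b):6.2f}" for b in (8, 16, 24))
    print(f"{f'{w}*{h}':>11} | {row}")
```

Subtract the value for your mode from your total graphics memory and you have the [available memory for textures] figure mentioned above.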
What is 'Refresh rate'?
Most computer users are familiar with the term 'refresh rate'. It is quite simply the rate at which your screen is updated, i.e. refreshed.
For a stable, flicker-free picture, at least 70 refreshes/second are recommended. (For every 'refresh', the picture on your monitor is redrawn.) A refresh rate of 50 updates/second gives you a more 'flickery' display, and less is worse... (Bear in mind however that this only applies to monitors that use Cathode Ray Tube (CRT) technology, which is basically all monitors that are not flat.)
The refresh rate is measured in Hz (Hertz); 1 Hz = 1 time/sec.
In ads for computer monitors you can sometimes see something called 'Horizontal Sweep Frequency'. I'll explain what that is below...
The refresh rate tells you how often the whole screen is updated.
The Horizontal Sweep Frequency, however, is the number of horizontal pixel lines the monitor can output per unit of time... e.g. a resolution of 640 (width) * 480 (height) means that the screen consists of 480 horizontal lines, each 640 pixels wide. The Horizontal Sweep Frequency (measured in kHz = kilohertz) tells you how many of these horizontal lines the monitor 'draws' every second. It is not your graphics card that does this job, but the monitor itself, so even if you have a very expensive graphics card in your computer, it is still the monitor that sets the upper limit for the quality of your display.
Real life example:
If you use a resolution of 800*600 pixels, you have 600 horizontal lines, each 800 pixels wide... Let's say you want your refresh rate at 76 Hz (which would give you a nice flicker-free display).
What this means for the monitor is that it must 'draw' 600 horizontal lines 76 times/sec: 600*76 = 45600.
45600 horizontal lines (each 800 pixels wide) is what the monitor must manage to 'draw' each second. 45600 Hz is the same as 45.6 kHz...
To sum it up:
45.6 kHz is the Horizontal Sweep Frequency that your monitor must manage if you are to display 800*600 at 76 Hz!
That's it, wasn't very strange, was it?
Here below, you can see a table containing the most common resolutions, refresh rates and the required HSFs (Horizontal Sweep Frequencies).
So, e.g. if you are buying a monitor and want to use 1600*1200 @ 85 Hz, make sure it manages at least 102 kHz Horizontal Sweep Frequency. (Keep in mind though that this only applies to CRT (Cathode Ray Tube) monitors. If you're buying a flat TFT screen, this is not relevant.)
|Resolution||Refresh rate||H. Sweep Freq.|
|640*480||60 Hz||28.8 kHz|
|640*480||76 Hz||36.5 kHz|
|640*480||85 Hz||40.8 kHz|
|640*480||100 Hz||48 kHz|
|800*600||60 Hz||36 kHz|
|800*600||76 Hz||45.6 kHz|
|800*600||85 Hz||51 kHz|
|800*600||100 Hz||60 kHz|
|1024*768||60 Hz||46 kHz|
|1024*768||76 Hz||58.4 kHz|
|1024*768||85 Hz||65.3 kHz|
|1024*768||100 Hz||76.8 kHz|
|1280*1024||60 Hz||61.4 kHz|
|1280*1024||76 Hz||77.8 kHz|
|1280*1024||85 Hz||87 kHz|
|1280*1024||100 Hz||102.4 kHz|
|Note that the resolution 1280*1024 doesn't have the same aspect ratio as the other resolutions. (Aspect ratio = width/height; in this case it's 1.25 instead of the standard 1.33.) While this isn't a problem in itself, it may cause some distortion on LCD screens, as the picture would have to be scaled unevenly.|
|1600*1200||60 Hz||72 kHz|
|1600*1200||76 Hz||91.2 kHz|
|1600*1200||85 Hz||102 kHz|
|1600*1200||100 Hz||120 kHz|
|2048*1536||60 Hz||92.2 kHz|
|2048*1536||76 Hz||116.7 kHz|
|2048*1536||85 Hz||130.6 kHz|
|2048*1536||100 Hz||153.6 kHz|