Display vs Screen for resolution
Most people do not think much about display or screen resolution, and when they do, they probably assume they have a good understanding of what those terms mean. Whether you have no idea at all or think you know the answer, read on and let me tell you what I know about resolution for displays and screens on your monitor and computer.
Let's begin with some background and how we got to where we are today. I think the numbers are much easier to understand once you know how computers generate output.
Back in the days
Way back in the days of the first dinosaur computers there were no screens, displays or monitors. When we think about how a computer in those days presented information or a result, we have to think ink, print and paper. No images, just text in a line-by-line format. When we move on to what the first monitor display looked like, it was pretty much the same. Still no images, just text line by line, because that was all people really needed. Instead of waiting for the printout they could see the results instantly on the screen. As an interesting side note: programming languages still have commands named "print" for sending data to the output device, and "echo" for sending text to the screen.
So the big question was: how do you display text on a screen? There are two problems here, and people tend to forget one of them. The first, obvious to most people, is how to build a monitor that can show text. That is not the real problem, though. The not-so-obvious question is the tough one: how does a computer create text for display? That second problem leads us to the answer for resolution.
The big chessboard
The engineers at the time had to figure this out, and when you think about what a computer actually does, the answer is simple. The only thing a computer really does is turn things on and off. Take a chessboard and you know exactly where B3 is. Imagine a monitor screen with 8-by-8 squares and you can tell anybody to turn the square at B3 on or off. Even a dumb computer can do that!
On those first monochrome monitors you could in fact see the raster and those little squares. The letters had to be drawn to fit into these squares.
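The chessboard idea can be sketched in a few lines of Python. This is purely illustrative (the function names are made up for this example): the screen is just a grid of cells that are either on or off, and a chess-style address like "B3" picks one cell to switch.

```python
# Illustrative sketch: a screen raster as a grid of on/off cells,
# addressed chess-style ("B3" = column B, row 3).

def make_raster(width=8, height=8):
    """Create a raster with every cell switched off."""
    return [[False] * width for _ in range(height)]

def turn_on(raster, square):
    """Switch on the cell addressed chess-style, e.g. 'B3'."""
    col = ord(square[0].upper()) - ord("A")  # 'B' -> column index 1
    row = int(square[1:]) - 1                # '3' -> row index 2
    raster[row][col] = True

def render(raster):
    """Draw the raster as text: '#' for on, '.' for off."""
    return "\n".join(
        "".join("#" if cell else "." for cell in row) for row in raster
    )

raster = make_raster()
turn_on(raster, "B3")
print(render(raster))
```

Nothing more is needed: drawing a letter is just switching on the right pattern of cells, which is exactly what those early text-only machines did.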
Not much has really changed since then, simply because computers still only do one thing: turning things on and off. A raster is maybe not the only way to tell a computer to put something on a screen, but it is the most efficient. Things have become smaller and smaller and, of course, faster and faster. Oh, and the squares got a new name. Meet the pixels.
Smaller is bigger
Along came Apple, who thought about how to present text on their first computers and monitors. By then the squares had become colorful, and they also got smaller and smaller. In other words, the density of pixels on monitor screens got bigger. For their first monitor Apple settled on a density, or raster, of 72 pixels per inch (ppi). There is plenty of information on the web about how they came up with that number, but to some degree it just does not really matter.
Your real estate
Despite the fact that screens are bigger today, the density is roughly the same. What has changed are the dimensions; we kept the density but added some real estate, i.e. bigger screens, and with them more pixels in width and height.
Some people remember VGA (Video Graphics Array) as one of the first standards. Among other things the standard was known for its dimensions: 640x480. It was followed by SVGA with 800x600 and XGA with 1024x768. Other interesting standards and numbers are HDTV (High Definition TV) with 1280x720 and FullHD with 1920x1080. These numbers are the total of pixels in width (horizontal) by height (vertical). An important and often overlooked fact is that these numbers are the computer's display resolution, not the monitor screen's resolution. It is the resolution the computer (plus the graphics card/chip) is meant to produce and send to the monitor's screen.
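To get a feel for what these width-by-height numbers mean in practice, here is a quick calculation of the total pixel count for each of the standards mentioned above; multiplying width by height gives the number of pixels the computer has to produce for every single frame.

```python
# Total pixel counts for the display standards mentioned in the text.
standards = {
    "VGA":    (640, 480),
    "SVGA":   (800, 600),
    "XGA":    (1024, 768),
    "HDTV":   (1280, 720),
    "FullHD": (1920, 1080),
}

for name, (w, h) in standards.items():
    print(f"{name:7s} {w}x{h} = {w * h:,} pixels")
```

FullHD works out to more than two million pixels per frame, which also hints at why higher resolutions demand more computing power.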
If you want to know the resolution of the screen, you have to look at the technical specification of the monitor, not of the computer.
Resolution belongs to computers
Not convinced? Let's step back in time once more. Nowadays we buy flat-screen monitors and TVs. Until just a few years ago we bought tubes: CRT monitors. They worked a little differently, and the display resolution was something we changed at will. In the early days CRT monitors could show a far better density than any resolution a computer was able to calculate in a timely manner. The time factor, by the way, is something we should not forget about the old days. The higher the resolution, the more computing power is required to generate a fresh image; that is not strictly true in computer graphics, but it is not the point here.
With flat screens today we usually do not change the resolution. We (should) stick with the default settings. Why? Because, unlike CRTs, they have those tiny little light bulbs sitting side by side, line by line. If you changed the resolution to anything other than the exact number of little light bulbs, parts of the image would not look so great, because a lot of the image information submitted by the computer would fall between two light bulbs, i.e. it would get cut off. Unless you have at least twice the number, but let's not waste any thoughts on that.
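A quick back-of-the-envelope calculation illustrates the "falls between two light bulbs" problem. The numbers below are just an example, not from any particular monitor: if the computer sends an 800-pixel-wide image to a panel that physically has 1024 pixels per row, each source pixel covers 1.28 panel pixels, so most source pixel edges do not line up with a physical pixel and have to be blended.

```python
# Illustrative example: mapping a non-native resolution onto a flat
# panel. The specific numbers here are assumptions for the sketch.

source_width = 800    # columns the computer renders (e.g. SVGA)
panel_width = 1024    # physical pixels per row on the panel (e.g. XGA)

scale = panel_width / source_width   # 1.28 panel pixels per source pixel

# A source column lands exactly on a physical pixel boundary only when
# x * panel_width is a whole multiple of source_width.
misaligned = sum(
    1 for x in range(source_width) if (x * panel_width) % source_width != 0
)

print(f"scale factor: {scale}")
print(f"{misaligned} of {source_width} source columns fall between panel pixels")
```

Only one in every 25 source columns lines up exactly in this example; everything else has to be smeared across two physical pixels, which is why non-native resolutions look soft.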
Usually with flat screens we have a match between screen resolution and display resolution.
What really matters
What really matters is the computer's display resolution and the number of pixels in that particular resolution.
When you think about the giant screen at Cowboys Stadium in Dallas, you might wonder how many pixels that is. The pixels on that board are different for sure, and apparently that gigantic screen corresponds to only 2423 x 1088 pixels. Now, are these the actual pixels of the monitor screen? It does not matter, because the only thing that matters is the computer and the resolution it produces.
The screen, monitor or whatever we want to call the device connected to the computer is irrelevant in terms of the actual number of pixels, or what we usually see and understand as resolution. The monitor only needs to understand how to display the resolution set and sent by the computer.