
Why do digital cameras typically have higher pixel resolution than the PC monitors available at the time?

computerdude92

I've always wondered this. Sorry for my ignorance, but why do digital cameras have higher pixel resolution than the computer screens of their era?

For example, 1920x1080 has been a very common high-end PC monitor resolution for many years. So why did they make cameras with resolutions like 3648x2752 (10 megapixels) back then, if the image can't even fit on the screen?

Image viewer software is needed because Microsoft Paint simply cannot display the image at its full size. I have to shrink the image down to 1920x1080 or smaller before I can really do anything with it on my system.

Thanks to anyone who can help explain this for me.
 
Because digital cameras take pictures meant for printing. Viewing them on a PC is not the primary purpose (well, it might be today).

Since you print photos at 300 to 360 dpi, you need that many pixels (e.g. about 2 MP for a small 10x15 cm print, but 8 MP for a large 20x30 cm print). It also often allows cropping without sacrificing quality, since you will still have more pixels left than you need for printing.
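If you want to sanity-check those numbers, it's just the print dimensions converted to inches, multiplied by the dpi in each direction. A quick Python sketch, assuming 300 dpi and metric print sizes:

CM_PER_INCH = 2.54

def megapixels_for_print(width_cm, height_cm, dpi=300):
    # Pixels needed to print at the given dpi, expressed in megapixels.
    w_px = width_cm / CM_PER_INCH * dpi
    h_px = height_cm / CM_PER_INCH * dpi
    return w_px * h_px / 1_000_000

print(megapixels_for_print(10, 15))  # small 10x15 cm print -> ~2.1 MP
print(megapixels_for_print(20, 30))  # large 20x30 cm print -> ~8.4 MP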

Apart from that, pretty much every single digital camera will let you choose to save pictures with a lower resolution.
 
Are you talking high-end cameras? Because I have had digital cameras since the beginning. I remember my mid-90s Sony with no removable storage and a serial cable link had pictures of VGA resolution. As they got better they were pretty much on par with, if not below, what monitors supported. I am no photographer, so my cameras have always been consumer grade.
 
A major advantage of having an oversized image is the ease of cropping. Cropping an image already at the correct size and then blowing up the result yields a blurry image.

I don't know what current production standards are but back when I worked on a newspaper, the printed version of a photo only used about a third of the original.
 
Image viewer software is needed because Microsoft Paint simply cannot display the image at its full size.

I'm having some trouble here determining if this is for real or a troll. Microsoft Paint has always been essentially just a demo program, not intended for any serious task. I mean, sure, it's probably enough for some people; Notepad is also probably more word processor than a fair number of people really need, but generally speaking...
Are you talking high-end cameras? Because I have had digital cameras since the beginning. I remember my mid-90s Sony with no removable storage and a serial cable link had pictures of VGA resolution. As they got better they were pretty much on par with, if not below, what monitors supported.

Early digital cameras used sensors similar to the first generation of CCD camcorders, so sure, their resolutions started out pretty much on par with NTSC/PAL television. Which was also apropos, because once you take a picture you need to store it in some kind of memory; an uncompressed full-color 640x480 photo takes almost a megabyte of RAM to hold in a framebuffer, and even compressed to a very lousy, lossy level of quality you're not going to fit more than a handful on the one or two MB of EEPROM/Flash those early cameras had. Considering your average computer monitor was somewhere in the 800x600 to 1024x768 ballpark by the mid-90s, yeah, digital cameras were a little below the typical monitor resolution for a while...
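The back-of-the-envelope math, if you're curious (24-bit color assumed, and the 10:1 JPEG ratio is just a guess at what "lousy" compression of that era bought you):

width, height, bytes_per_pixel = 640, 480, 3

raw_bytes = width * height * bytes_per_pixel
print(raw_bytes / 1024 / 1024)       # ~0.88 MB uncompressed, per frame

jpeg_bytes = raw_bytes / 10          # assume roughly 10:1 compression
print(1024 * 1024 / jpeg_bytes)      # ~11 photos on a 1 MB card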

But *real* digital cameras broke this threshold a *long* time ago. (Not talking about things like the front-facing cameras in cell phones and whatnot, which are still running at "roughly monitor-level" resolution.) A "relatively cheap" Canon point-and-shoot like an A50 had a 1.3 megapixel sensor, which makes it equivalent to a 1280x960 monitor, which was "pretty good" for the Windows 98 era; by 2004 a similar pocket camera rocked a 4 megapixel sensor. 2004 was a little before we all went widescreen on our computer monitors, so to put this in perspective, a 1600x1200 monitor, about the highest common resolution 4:3 monitor, is only a hair under two megapixels.

By 2008 we're talking eight megapixels for the same product category. That's about as many pixels as a 4K monitor or TV set, and those weren't exactly common in 2008. This kind of camera is of course almost dead because of how good high-end cell phones have gotten, but FWIW, 2017's model was 20 megapixels. "Monitors" with resolutions higher than this do exist, but mortals can't afford them. (We are talking about IMAX digital projectors here.) And these are pocket point-and-shoot cameras, not high-end models.

TL;DR: to find a camera with a resolution lower than your monitor's, you've had to work pretty hard for a while now.
 
Oh, I've still got my 640x480 digicam which takes a 2MB CF card.

One thing not mentioned is that "resolution" is a squirrely term when it comes to digicams. Sometimes the resolution is given as the result of adding additional pixels by interpolating adjacent pixels. Yes, it's a cheat. You often see this in webcams, where the resolution is quoted as 1080p, but the camera itself isn't anywhere near it.
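A quick way to see why it's a cheat: interpolation manufactures pixels, not detail. A minimal Pillow sketch (the filenames and the 640x480 "real" sensor size are made up for illustration):

from PIL import Image

# Hypothetical 640x480 sensor output upscaled to a claimed "1080p".
# Resizing invents in-between pixel values; it cannot recover detail
# the sensor never captured, so the result is a soft 0.3 MP image
# wearing a 2 MP badge.
native = Image.open("sensor_output_640x480.jpg")
claimed = native.resize((1920, 1080), Image.BILINEAR)
claimed.save("marketing_1080p.jpg")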
 
On this question it's really worth noting just how "stuck" typical computer monitor resolutions were for a long time. "Megapixel" was a term being bandied about by the early-to-mid 80s in reference to workstation-class computers, and by the end of the 80s it wasn't *that* unreasonable to have a 1024x768 monitor (which is technically only three quarters of a megapixel, but it's getting into the ballpark) on a high-end computer. (Although that kind of resolution wasn't really *common* until the mid-1990s.)

And then... it stayed stuck there for a long, long, long time. Most laptops were shipping with 1024x768 screens right up to when they started going widescreen in the mid-aughts, and after that the standard resolution was just a slightly wider 1280x800-ish. Desktop screens likewise weren't much higher; the standard-before-widescreen 17" LCD was 1280x1024, and those mostly got replaced with a motley mix of 1440x900 and 1920x1080 screens.

Unless you specifically go out of your way to get a HiDPI screen, you're probably only rocking three or four times as many pixels as a typical late-1990s computer. When you consider just how much faster and bigger everything else in the computer is than one from 25 years ago, it's kind of understandable why photos from a modern digital camera would swamp an old machine that was never designed to scale bitmaps of that magnitude in real time.

As to *why* pixel counts haven't changed that much, I think the glib answer is that evolution has been dragging its feet in giving us higher-resolution eyes. Depending on how far you sit from the screen, we start hitting diminishing returns once you get much over 150 DPI or so. As has already been noted, the advantage in building the *camera* so it can acquire so many more pixels than we "need" is that oversampling gives you the data for digital zooms, crops, etc. It's actually pretty awesome how, if you have enough pixels in your sensor, you can shoot a photo of a bird that's just a little speck in terms of the whole frame but still makes for a decent web photo when you crop it down.
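To make the bird example concrete, here's a minimal Pillow sketch (the filenames, sensor size, and crop coordinates are all invented for illustration):

from PIL import Image

photo = Image.open("full_frame_20mp.jpg")   # hypothetical 5472x3648 shot

# The bird is a speck in the frame, but this crop is still 800x600
# real sensor pixels -- plenty for a decent web photo.
bird = photo.crop((2400, 1500, 3200, 2100))
bird.save("bird_web.jpg")

# Contrast with capturing at monitor resolution and enlarging the same
# region afterwards: the upscale has to invent pixels, so it's blurry.
small = photo.resize((1920, 1280))          # pretend this was the capture
speck = small.crop((842, 526, 1123, 737))   # same region, now ~281x211 px
blurry = speck.resize((800, 600))           # blown up, but no new detail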

Oh, I've still got my 640x480 digicam which takes a 2MB CF card.

I stuck with film point-and-shoots until around 2003 or so, when I bought a cheap Fuji FinePix that took those really thin (even by today's standards) "SmartMedia" cards. Think I had 16MB and 32MB ones? That thing was one of the cheapest "not obviously garbage" digital cameras you could buy at the time and even it did 1600x1200. (You could turn the resolution down, though, if you wanted to stuff more pictures onto a card.)

Not to say you couldn't still buy a VGA resolution (or lower) digital camera if you really wanted one. I think I have one with a keychain loop sitting in a junk box somewhere, around the same vintage.
 
Both monitors and cameras are available in a wide range of resolutions and I don't see any real connection between the two, but it should be noted that a 'pixel' isn't even the same quantity in this comparison. One pixel on a monitor represents red, green, and blue. On a camera sensor, one pixel (with few exceptions) captures only one color component, i.e. red, green, OR blue; the missing data in the final image is interpolated somehow. So one might even say that a 2MP monitor is equivalent to a 6MP camera.
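That arrangement is the Bayer filter mosaic. A toy numpy sketch of the bookkeeping (a 4x4 mosaic, not any camera's actual pipeline):

import numpy as np

# Bayer pattern: each sensor pixel sits under a single color filter.
#   R G R G
#   G B G B
# A 2MP monitor displays 3 values per pixel (6M numbers); a 2MP sensor
# records 2M numbers, one color each, and the other two components per
# pixel are interpolated ("demosaiced") afterwards.
h, w = 4, 4
sensor = np.arange(h * w, dtype=float).reshape(h, w)  # toy raw readings

red = np.zeros((h, w))
red[0::2, 0::2] = sensor[0::2, 0::2]      # red sites: even rows/cols
green = np.zeros((h, w))
green[0::2, 1::2] = sensor[0::2, 1::2]    # green sites, two per 2x2 block
green[1::2, 0::2] = sensor[1::2, 0::2]
blue = np.zeros((h, w))
blue[1::2, 1::2] = sensor[1::2, 1::2]     # blue sites: odd rows/cols

# A real camera fills in the zeros by interpolating neighboring sites;
# that's the "missing data ... interpolated somehow" above.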
 