the scoop on digital imagery. . . .
shadowfax, on host 206.191.194.169
Tuesday, December 5, 2000, at 06:22:59
computer question posted by Don Thornton on Tuesday, December 5, 2000, at 04:56:20:
> My wonderful wife, Ticia, tells me that you people know everything there is to know about computers, or that you know people who do. So I thought I'd see if I could get some free customer support :)
>
> (On a side note, speaking of Ticia--you thought we were speaking about computers!--I was just made aware of the time by my wife. Every day, between 5:40am and 5:45am I hear "I'm sooo tired..." That's when I know that her car pool is just a few minutes away. She is so consistent! I love it :)
>
> Uh, oh! She just read this and gave me a serious "Humph!" sound. That means she's piqued.
>
> Ok, where was I--Australia!
>
> Here's the deal: I just discovered the joys of digital picture resolution. I found that instead of just changing the size of a picture, I can change the number of pixels per square inch. And of course, when I find a good thing, I tend to carry it to extremes. I thought "If 300 p/i is good, then 1000 p/i ought to be fantastic!" But then it occurred to me that if my printer is less than stellar, it wouldn't matter how awesome my picture resolution is.
>
> Ok, I think I've almost worked myself up to asking the question. I found out that my printer has a resolution of 600 dpi (dots per square inch?). How does this compare to pixels per square inch? Is a pixel its own dot? Dot's pretty much all I want to know for now.
OK, here's the scoop:
600 dpi is about the upper limit for anything you want to print, unless you're talking about super high quality National Geographic-style pictures. This is true for three reasons:
1) The human eye can't tell the difference between 600 dpi and 1000 dpi unless you're printing on really high quality paper.
2) Paper sucks. Unless you get that horridly expensive $1/page stuff, your paper really won't support more than 600 dpi (and usually won't support more than 300). Why? Paper has tiny little holes in it. You can even see 'em with the naked eye. Inkjet printers spray ink onto the paper, and some of it gets lost in these little holes, distorting the results. At normal settings the dpi isn't high enough for that distortion to be noticeable, but if you go setting it to 1200 dpi like a lot of people do, you'll start noticing that it doesn't look any better than 300 or 600 (depending on how cheap your paper is).
3) File size and load time. The lower the dpi, the smaller the file. When I do graphic design projects where I need the resolution up around 600 or 1200 dpi, I routinely end up with 20, 40, and even 100 megabyte files. That's somewhat of a pain even on a relatively fast machine like mine. It's also why I have 50 gigs of hard drive space ;)
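Just to put numbers on that, here's a quick back-of-the-envelope Python sketch (the 8x10 inch print size and 3 bytes per pixel are my own assumptions for illustration, not from any particular program):

    # uncompressed size of an 8x10 inch image at various dpi, 24-bit color
    for dpi in (75, 300, 600, 1200):
        pixels = (8 * dpi) * (10 * dpi)          # width x height in pixels
        megabytes = pixels * 3 / (1024 * 1024)   # 3 bytes per pixel (RGB)
        print(f"{dpi:>4} dpi -> {megabytes:7.1f} MB uncompressed")

At 600 dpi that's around 82 MB, and at 1200 dpi it's over 300 MB before any compression, which is exactly why those files get painful.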
So what dpi is good?
75 for web publishing. Most monitors won't display more than 75 dpi anyway, so what's the point in upping the resolution? Also, there's that larger file size again -- you don't want your web pages to slow down just because you have useless extra resolution. The exception to this would be if you publish something that you expect people to print out.
300 for normal printing. You should never go higher than 300 for text-only documents -- there would be no point, because 300 dpi is about the limit of what the human eye can resolve in text.
600 max for picture printing on normal paper.
1200 max for picture printing on photo paper. (See the quick sketch below for what those dpi figures work out to in actual pixels.)
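Here's that sketch -- a tiny Python helper (pixels_needed is just a name I made up for illustration) that converts a print size and a dpi setting into the pixel dimensions your image needs:

    # how many pixels you need for a given print size at a given dpi
    def pixels_needed(width_in, height_in, dpi):
        # dpi is a linear measure, so just multiply each side by it
        return round(width_in * dpi), round(height_in * dpi)

    for dpi in (75, 300, 600, 1200):
        w, h = pixels_needed(4, 6, dpi)
        print(f"{dpi:>4} dpi for a 4x6 print: {w} x {h} pixels")

So a 4x6 at 300 dpi is 1200 x 1800 pixels, and doubling the dpi quadruples the pixel count -- that's where the file size explosion comes from.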
Here's a hint for ya too. Use laser printer paper in your inkjet. Laser paper has fewer pores than inkjet paper, so you'll get a higher printed resolution with less bleed-through than you would with regular inkjet paper.
And to answer your parenthetical question: dpi (dots per inch) is a measure of how many dots are printed along one inch -- it's a linear measure, not dots per square inch.
as for a pixel, here's the dictionary definition:
Short for Picture Element, a pixel is a single point in a graphic image. Graphics monitors display pictures by dividing the display screen into thousands (or millions) of pixels, arranged in rows and columns. The pixels are so close together that they appear connected.
The number of bits used to represent each pixel determines how many colors or shades of gray can be displayed. For example, in 8-bit color mode, the color monitor uses 8 bits for each pixel, making it possible to display 2 to the 8th power (256) different colors or shades of gray.
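That 2-to-the-nth relationship is easy to check for yourself; here's a quick Python loop (just my own illustration of the arithmetic):

    # colors available at common bit depths: 2 raised to the number of bits
    for bits in (1, 8, 16, 24):
        print(f"{bits:>2}-bit color: {2 ** bits:,} possible colors per pixel")

That prints 2, 256, 65,536, and 16,777,216 -- which is where the "16 million colors" figure below comes from.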
On color monitors, each pixel is actually composed of three dots -- a red, a blue, and a green one. Ideally, the three dots should all converge at the same point, but all monitors have some convergence error that can make color pixels appear fuzzy.
The quality of a display system largely depends on its resolution, how many pixels it can display, and how many bits are used to represent each pixel. VGA systems display 640 by 480, or about 300,000 pixels. In contrast, SVGA systems display 1,024 by 768, or nearly 800,000 pixels. True Color systems use 24 bits per pixel, allowing them to display more than 16 million different colors.
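If you want to verify those pixel counts (the memory figures are my own addition, assuming an uncompressed 24-bit frame):

    # pixel counts for VGA and SVGA, plus video memory at 24 bits per pixel
    modes = {"VGA": (640, 480), "SVGA": (1024, 768)}
    for name, (w, h) in modes.items():
        pixels = w * h
        mb = pixels * 3 / (1024 * 1024)   # 3 bytes per pixel in True Color
        print(f"{name}: {w} x {h} = {pixels:,} pixels ({mb:.2f} MB at 24-bit)")

That works out to 307,200 pixels for VGA and 786,432 for SVGA.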