
What is “DPI?” Is it contagious? Or “PPI?” Are there antibiotics for that? Isn’t that what I measure air with in my tires?

These terms can be pretty confusing. Our CTO Alex Theodore took the time to write up a pretty fantastic piece explaining the difference between them, and what makes an image “high-quality.” Enjoy.

Ever since digital cameras began entering the hands of everyday people, we have all been confused about what exactly all those technical terms like “pixels,” “resolution” and “megabytes” really mean at the end of the day, especially when it comes to the very relative term “high quality” that we throw around with digital prints.

So, we’d like to take a few minutes for anyone out there who needs a quick and simple explanation that will hopefully clear up some of the confusion.

First off, despite how we sometimes think of images, “quality” is a relative term and there is no single standard. While it is easy to point out bad quality, there is no everyday convention for quantifying high quality. So, to dispel the myth: for the average person, “high quality” really is in the eye of the beholder.


You might then ask, “Well, I thought DPI (dots per inch) was a measure of quality?” While that’s partially true, it’s led to a lot of confusion because of misuse. So, let’s begin with a quick rundown of the two most common places you will see digital images:

After your digital camera or phone, your computer monitor is the first place where you look at your pictures. The images you see on your computer are the result of thousands of tiny colored and illuminated squares that physically make up the screen. Those squares are called pixels – a term that’s also used to describe other similar devices and uses.

Each pixel can produce a large range of colors, so when you stack all the pixels together, they create a complete picture. The smaller the pixels, the less you notice them and the more detailed the photo. Now, suppose you want to say “high quality.” It would be logical to describe the number of pixels and their size by stating the number of “pixels per inch” – but, for whatever reason, that just never became the norm. Somehow in the end (thanks to some confusion with printers), we just got used to saying “dots per inch,” or DPI. That convention works well enough for a while, but problems arise when other things that are very different from pixels are incorrectly also called dots and we use DPI to describe them too.
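If you like concrete numbers, here is a minimal sketch (in Python, using a made-up 24-inch, 1920 x 1080 monitor as the example) of how “pixels per inch” falls out of a screen’s pixel dimensions and its physical size:

```python
import math

def monitor_ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch of a display, from its pixel dimensions and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # length of the diagonal, in pixels
    return diagonal_px / diagonal_inches

# Example: a 24-inch monitor showing 1920 x 1080 pixels
print(round(monitor_ppi(1920, 1080, 24)))  # ~92 pixels per inch
```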

Printing is the exciting process of taking an intangible image from your computer monitor to something physical you can hold in your hand. There are many different types of printers and they all use different ways of achieving similar results.

By far the most relevant to common consumers is inkjet printing. The way a computer monitor creates temporary images is actually similar to how a printer makes permanent images: using a bunch of small dots of color. The best analogy for inkjet printing is to imagine a painter using an extremely fine paintbrush that gives great detail as it paints your picture one line at a time. An inkjet printer makes millions of small droplets of ink and places each drop carefully in near-perfect rows to recreate the digital picture. Because there are many more possible colors than there are shades of ink, three or four basic colors can usually be combined to reproduce practically all possible colors. This brings us to an important point.

[Image: dots vs. pixels]

Oftentimes these ink drops are also called dots, and over an area of the print the resolution term DPI is used again, just as with pixels. But ink drops are different from pixels: where your computer needs only one pixel to produce any shade of color, an inkjet printer needs three or four (or even many more) dots of different colored ink to reproduce one pixel that looks like the combination of the dots’ colors. The typical ink sets for basic printers are either red, green and blue or cyan, magenta, yellow and black. For this reason, you may hear of printers making wild claims of thousands of DPI in resolution.

For example, if your home printer uses a three-color system of red, green and blue (RGB) and is printing a digital file at 300 DPI, it will need to put down up to 900 dots per inch (3 colors x 300 pixels per inch) in order to recreate the original 300 DPI digital file.
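To make that simplified arithmetic explicit, here is a small Python sketch. The assumption that every pixel needs one dot of each ink is the same simplification used above; real printers use halftoning and dithering, so the true relationship is messier:

```python
def printer_dots_per_inch(image_ppi: int, inks_per_pixel: int) -> int:
    """Upper-bound ink dots per inch, assuming one dot of every ink per pixel."""
    return image_ppi * inks_per_pixel

# The example above: a 300 PPI file on a three-ink printer
print(printer_dots_per_inch(300, 3))  # 900 dots per inch
# The same file on a four-ink (CMYK-style) printer
print(printer_dots_per_inch(300, 4))  # 1200 dots per inch
```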

So, another misconception to clarify: 300 DPI on your computer screen is not the same thing as 300 DPI on a printer. 

At the end of the day, studies show that our eyes cannot tell any difference in quality beyond 300-360 pixels per inch, so for the most part anything greater than that is a waste. However, for a printer, it sometimes takes even more than three or four ink dots to make a really good pixel, so you will tend to see large multiples of 300 DPI such as 600, 1200 and even up to 2400 DPI and beyond. Generally speaking, the higher the better – but only for printing! (While the resolution of the file plays a big role in the final print, the printing technology can have just as big an impact, or more, on the final print quality.)


That brings us to the last bit of explaining you need in order to be a very well-educated printing consumer. What kind of image file do you need to produce a good print?

The confusion on this topic begins with the nature of a digital file on your screen. Where a printed picture has a fixed length, width and quality, a digital image does not. It can be zoomed in or out as much as you like – but the amount of information in the file stays the same; it is simply being stretched or shrunk at your will. Digital image files are also made of pixels, but in this case the pixels are data that represent colors, not actual colored squares like the ones on your computer screen. The term “resolution” is used to describe how many pixels there are in the file – both the width and the height. An image from your digital camera may say something like 3264 x 2448 (pixels wide by pixels tall), which means that the image you photographed was broken down into roughly eight million pieces (multiplying width by height) to give you an 8-megapixel file.
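As a quick sanity check on that arithmetic, here is a tiny Python sketch that turns pixel dimensions into megapixels (the 3264 x 2448 figure is just the camera example from above):

```python
def megapixels(width_px: int, height_px: int) -> float:
    """Total number of pixels in an image, expressed in millions (megapixels)."""
    return width_px * height_px / 1_000_000

# The camera example from above: 3264 x 2448 pixels
print(round(megapixels(3264, 2448), 1))  # ~8.0 megapixels
```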

Generally speaking, the more pixels the better. But that doesn’t necessarily mean better quality – what it really means is that you can print the picture larger while maintaining good quality. For instance, a 2.0-megapixel camera will produce a print of the same quality as a 4.0-megapixel camera if the print covers half the area (about 70% of the width and height).
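Here is a rough Python sketch of that trade-off, using made-up pixel dimensions for a roughly 4-megapixel and a roughly 2-megapixel file. The point is that the smaller file, printed over about half the area, lands at the same pixels-per-inch sharpness:

```python
def effective_ppi(width_px: int, print_width_inches: float) -> float:
    """Pixels per inch you actually get when a file is printed at a given width."""
    return width_px / print_width_inches

# Hypothetical files: roughly 4 MP (2464 x 1632) and roughly 2 MP (1728 x 1152)
print(round(effective_ppi(2464, 8.0)))  # ~308 PPI at 8 inches wide
print(round(effective_ppi(1728, 5.6)))  # ~309 PPI at 5.6 inches wide (about 70% of 8)
```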

You may recognize that the most common file type for images is JPEG, or “.jpg”. JPEG is a way of compressing the information in a file by reducing redundancy (such as all the nearly identical green pixels in a grassy field). You can often simplify the image a great deal before any noticeable quality difference appears. While different degrees of compression give different file sizes, the pixel resolution stays the same. This is why the size of a file can sometimes be a helpful way to gauge how good an image will look when printed.
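If you want to see this for yourself, here is a short Python sketch using the Pillow imaging library (an assumption on my part; “photo.jpg” is a hypothetical input file). It saves the same image at two compression levels: the file sizes differ, but the pixel resolution does not:

```python
import os
from PIL import Image  # assumes the Pillow imaging library is installed

img = Image.open("photo.jpg")        # hypothetical input file
print(img.size)                      # pixel resolution, e.g. (3264, 2448)

# Save the very same image at two different JPEG compression levels
img.save("photo_q95.jpg", quality=95)  # light compression, bigger file
img.save("photo_q40.jpg", quality=40)  # heavy compression, smaller file

for path in ("photo_q95.jpg", "photo_q40.jpg"):
    saved = Image.open(path)
    # The resolution is identical; only the file size (and fine detail) changes
    print(path, saved.size, os.path.getsize(path), "bytes")
```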

Hopefully this helped you understand what the differences between DPI and PPI are and why they’re important to understand.


Maybe just more questions about this post?


Let us know on Twitter

GIFs by Giphy
