Saturday, August 24, 2024

Confusing Concepts - Resolution

VOLUMES UPON volumes have been written about this topic. I am not for a moment trying to convince you that I am either an expert or the proverbial "last word." In recent weeks, I have read a few comments here, and online in other blogs (mostly in the discussions and comments), that seem to me to underscore a lack of complete understanding of the terminology. What motivates this blog (and a couple more to follow) is the thought that maybe I can shed some - albeit elementary - light on these topics. This is the first of a 3-part series.

PART OF the confusion probably stems from the fact that there are actually different kinds of "resolution" when we apply the term to photography. It is a rather broad term, which is often used imprecisely. The making of a photographic image involves a lens, a medium - these days mostly a digital sensor and the resulting file - and a manner of display. Each of these components puts a different "spin" on the word "resolution." Consequently, when we are addressing resolution, we need to understand what kind of resolution we mean.

Sensor Size Comparison

THE RESOLUTION of a particular camera lens (or its "resolving power") simply refers to its ability to resolve detail. There are numerous factors that affect this ability, including lens design, the size and quality of the glass elements, coatings, etc.

IN THE case of digital cameras, resolution also refers to the sensor used to record the digital image. This component of the optical "system" is perhaps the most difficult to get one's arms around. Sensors are intricate mechanisms. On a rudimentary level, they seem simple enough: they are just a collection of microscopic electronic recording sites (known as photo sites) grouped together on the sensor surface. A more in-depth look at sensors leads us to realize that things aren't as simple as that sounds. Two significant factors are the size and number of the individual photo sites. It seems evident enough that a smaller sensor will not be able to hold as many same-sized photo sites (or photo cells) as a larger sensor.

Sensor size is functionally related to the lens's image circle. Smaller lenses will only "cover" a smaller sensor area; as the sensor gets larger, lenses must be designed with larger image circles in order to cover it. The reason lenses project a circular image is really beyond the scope of this article (and my expertise, 😰), but it is a matter of physics and the desire to balance the light being directed by the lens. If you use an image sensor that is larger than the image circle, the image will show up as a circle encompassed by a black area outside the circle.

As a general rule (we will see, as we go on, that these things don't work independently), smaller photo sites will have less "resolving" power than larger ones. Coupled with an optical occurrence known as diffraction (stay tuned), conventional wisdom has it that cameras based on smaller sensors will generally have less resolving power than those based on larger ones. While not precisely correct, it is a valid consideration when using such equipment. I have only recently empirically tested (and confirmed) that this applies to my m4/3 camera setup as compared to my "full frame" sensor gear.
The rationale for this line of thinking is that it is difficult to match photo sites in terms of both number and size on a smaller sensor. My Olympus m4/3 sensor has roughly 1/4 the area of my Sony a7Rii "full (35mm equivalent) frame" sensor. At 20 megapixels (a measure of the number of photo sites on the sensor), it is the highest-resolution m4/3 sensor available, to the best of my knowledge. My Sony, on the other hand, is 42 megapixels (and the newest iteration - the a7Rv - is 61 mp). Not only are there 2 - 3 times as many sites, but each individual photo site is also significantly larger physically. That combination creates conditions for increased sensor resolution.
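To put rough numbers on that comparison, here is a small Python sketch. The sensor dimensions and pixel counts are approximate published figures for a typical 20 mp m4/3 body and a ~42 mp full-frame body, used purely for illustration:

```python
# Rough comparison of photo-site (pixel) pitch and sensor area between a
# Micro Four Thirds sensor and a "full frame" (35mm) sensor.
# Dimensions below are approximate published figures, not measurements.

def pixel_pitch_um(sensor_width_mm, pixels_wide):
    """Approximate width of a single photo site, in micrometers."""
    return sensor_width_mm / pixels_wide * 1000.0

# Micro Four Thirds (20 mp class): ~17.3 x 13.0 mm, 5184 x 3888 pixels
m43_pitch = pixel_pitch_um(17.3, 5184)
m43_area = 17.3 * 13.0     # mm^2

# Full frame (~42 mp class): ~35.9 x 24.0 mm, 7952 x 5304 pixels
ff_pitch = pixel_pitch_um(35.9, 7952)
ff_area = 35.9 * 24.0      # mm^2

print(f"m4/3 pitch: {m43_pitch:.2f} um, full-frame pitch: {ff_pitch:.2f} um")
print(f"area ratio (full frame / m4/3): {ff_area / m43_area:.1f}x")
```

Despite holding twice the megapixels, the full-frame sensor still ends up with noticeably larger individual photo sites (roughly 4.5 vs. 3.3 microns), because its surface area is nearly four times greater.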

Bayer Color Filter Array

THERE IS more to the sensor story than photo sites and sizes, though. Diffraction plays a significant part in this equation, too; I will cover diffraction all by itself in the next post. During the early years of digital sensors, one of the concerns that designers (and users, of course) had was the phenomenon of "aliasing." As we have discussed here in the past, the basis of a digital image is the "lego-like" stacking of rectangular pixels to produce the shapes found in images. Because of these individual pixels, there are always straight-line transitions between pixels (to continue the analogy, between lego blocks). At some level - particularly in lower "resolution" images (in this case meaning smaller sensors and fewer megapixels) - edges will take on a jagged appearance (the "jaggies"). In order to address this concern, camera manufacturers put an anti-aliasing filter (known as a "low pass" filter) in front of the sensor, designed to introduce a bit of blur. Obviously, my explanation is hopelessly oversimplified, and the process is/was complex, if not consistent. As manufacturers added megapixels (my first Nikon D100 was a 6mp camera), and processing software (especially raw conversion engines) got better and better, the aliasing issue became less important. Indeed, I have personally sought out cameras that do not have the low pass filter, reasoning that I don't want anything I don't absolutely need introducing softness; on the contrary, I am looking for the maximum sharpness I can get. In my view, the presence of an AA filter - though perhaps only very marginally - affects resolution. Neither of my current cameras (Sony a7Rii and Olympus EM10iv) has an AA filter on its sensor.
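The "jaggies" themselves are easy to demonstrate. This little Python sketch (purely illustrative, no camera involved) samples a smooth diagonal edge onto a coarse grid of on/off pixels, and the staircase appears on its own:

```python
# Illustration of aliasing "jaggies": a smooth diagonal edge, sampled
# onto a coarse grid where each pixel is simply on or off, becomes a
# staircase. An anti-aliasing (low pass) filter softens exactly this
# kind of hard transition.

def rasterize_diagonal(size):
    """Return the diagonal edge as rows of '#' (lit) and '.' (dark) pixels."""
    return ["".join("#" if col <= row else "." for col in range(size))
            for row in range(size)]

for line in rasterize_diagonal(8):
    print(line)
```

The coarser the grid (fewer "megapixels"), the bigger each stair step looks relative to the image, which is why aliasing mattered more on early, low-resolution sensors.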

Why are the pictures square if the lens is round? - (Steven Wright)

A SECOND filter (or filter array), known as a "Bayer Color Filter Array," is placed in front of almost every digital sensor. Through a process called digital sampling, the sensor creates the digital image, and the Bayer filter adds color sampling, which produces the colors in our images using the primary colors red, green and blue. The sharp observer will note that there are (many) more green filters than red or blue (see the illustration above). This is because our visual system is most sensitive to the green light spectrum, which is where the sun emits the largest amount of light; green light contributes much more to our perception of luminance. Color filter arrays are designed to capture twice as much green light as either of the other two colors. The takeaway here is that the Bayer filter is yet another layer of interference between the light rays and the sensor sites. This introduces softness and therefore affects resolution. It also explains why most raw processing software applies a "default" amount of sharpening (often referred to as "capture sharpening") automatically to a raw digital file. Most software (Adobe ACR, for example) allows the user to adjust, or even eliminate, that default sharpening.
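As a sanity check on that two-to-one green ratio, here is a minimal Python sketch of the repeating 2x2 Bayer tile (an RGGB layout is assumed for illustration; actual sensors vary in how the tile is oriented):

```python
# Minimal sketch of the Bayer mosaic: each repeating 2x2 tile holds one
# red, one blue, and two green photo sites, so the sensor samples green
# twice as often as either red or blue.

BAYER_TILE = [["R", "G"],
              ["G", "B"]]

def bayer_color(row, col):
    """Color of the filter over the photo site at (row, col)."""
    return BAYER_TILE[row % 2][col % 2]

# Count each color over an 8x8 patch of the sensor.
counts = {"R": 0, "G": 0, "B": 0}
for r in range(8):
    for c in range(8):
        counts[bayer_color(r, c)] += 1

print(counts)  # green sites outnumber red and blue two to one
```

Each photo site records only one color; the missing two values at every site are interpolated from neighbors during raw conversion (demosaicing), which is one source of the softness that "capture sharpening" compensates for.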

all of these individual measures of resolution work together to create the end product

THE LAST of my three resolution considerations is the manner of display. For many years, the primary method of display was the print, on a photographic fiber medium. The processes ranged from pigments embedded into the medium (traditional photographic darkroom printing), to printing-press ink methods, to the more modern digital inkjet printing. By the time of the latter, we were also commonly displaying images on cathode ray tube (CRT) monitors, and eventually, LCD screens. Prior to the emergence of digital, another method of displaying images was the color-transparency system (or simply slides). Each of these presentation methods reacts differently in terms of resolution. The medium itself has its own "resolution," which - once an image is put in the form of the presentation - becomes the predominant factor. With the ascendancy of social media, smart phones and tablets, it is probably safe to say that digital display is the most common manner of presentation today. Resolution in the context of presentation media has begotten perhaps one of the most confusing terminology puzzles in the realm of resolution. Resolution of an image shown on a CRT/LCD screen is purely electronic and is often measured in pixels per inch (PPI), a measure of the pixel density of the displayed image. When we speak of an inkjet print, however, the printer uses colored pigments to create a microscopic dot-based pattern on the medium. The correct resolution terminology here is "dots per inch" (DPI); DPI is also used for traditional printing-press media. The two (PPI/DPI) are often - confusingly - interchanged.
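The PPI side of this is simple arithmetic: the same pixel file gets physically larger as the pixels-per-inch figure drops. A quick Python sketch, using a hypothetical 20 mp (5184 x 3888 pixel) file as the example:

```python
# PPI arithmetic: physical size of a fixed pixel file at different
# display/print resolutions. The 5184 x 3888 file is a hypothetical
# 20 mp example.

def print_size_inches(pixels_wide, pixels_high, ppi):
    """Physical (width, height) in inches at a given pixels-per-inch."""
    return pixels_wide / ppi, pixels_high / ppi

for ppi in (300, 150, 72):
    w, h = print_size_inches(5184, 3888, ppi)
    print(f"{ppi:>3} PPI -> {w:.1f} x {h:.1f} inches")
```

At a print-quality 300 PPI the file makes roughly a 17 x 13 inch print; at a screen-like 72 PPI the same pixels would nominally span six feet. The pixels never change; only the density at which they are presented does.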

AS I said earlier, all of these individual measures of resolution work together to create the end product. When choosing and using camera gear, an understanding of these factors will help you make more sense of your choices. When the hype from a seller, or the specifications from one of the testers out there, emphasizes a particular component's "resolution" or "resolving power," it is important to think about the other components. The highest quality lens (think Leica or Zeiss), on a "medium format" sensor (or larger) camera (and yes, confoundingly, in the digital world, MF is bigger than "full frame"), producing images that will only ever be seen on your FB or Instagram page, is extreme overkill. The final digital resting spot for the image cannot begin to match the resolution of the other two components.

Resolution in the context of presentation media has begotten perhaps one of the most confusing terminology puzzles in the realm of resolution

I AM not saying you shouldn't have high quality or high resolution equipment. I am saying that an understanding of resolution and its significant variability will help put your photography - and your gear needs/wants - in perspective. "Pixel peeping" is a (sometimes pejorative) description given to photographers who tend to place an over-emphasis on technical factors, like resolution, noise (see, What's All the Noise about Noise), and diffraction (another term I will cover in an upcoming blog), over the more artistic part of photography. To be sure, some fundamental skills and reasonably good quality equipment are required to make sure the image is going to be viewable as intended. But beyond that, in many cases, the technical issues tend to be overblown, in my opinion.

WE STILL haven't told the whole story though! Stay tuned for upcoming blogs on Diffraction and Image Sharpness.
