The terrorists have not won
There is a tasteless joke whose punchline is, “well, we’ve established what kind of girl you are; now we’re just trying to establish the price.” It goes back to a newspaper column by the hereditary peer and reformed Canadian Lord Beaverbrook, it is probably fictional in origin, and it has been twisted around in a number of ways. Nevertheless, the quip is a great counterpoint to people who make a point of maintaining their photographic “integrity” by using some “less automated” form of digital photography.
Every technical aspect of digital photography (or as film snobs would call it, digital imaging) is nontraditional and somewhat automated. Light does not write an image on anything (we have the φωτός part; we have no γραφή). Instead, light hits an electronic sensing surface that translates light into analog measurements automatically, those measurements are converted to numbers automatically, and a computer in the camera bakes those numbers into a RAW image file automatically. That file is in turn transformed into something visible to humans, either in the camera or on a computer – and it is only in this final stage that human control returns, and it is a totally different type of control than chemical development and optical printing. The physics and chemistry of film photography are actually simple compared to the computation required for digital photography. Put it this way: the oxidation-reduction reactions used in film photography are taught in high-school chemistry; the mathematical transformations needed to convert Bayer sensor measurements into recognizable images are almost graduate-school math. Or to put it bluntly: men went to the moon in vehicles with computers less sophisticated than what we now use to replicate the 1960s Hasselblad film cameras they took along.
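To make that concrete, here is a deliberately crude sketch (in Python with numpy and scipy, my choices, not anything a camera actually runs) of just the first of those transformations: turning a one-value-per-pixel Bayer mosaic into a three-channel color image by bilinear interpolation. Real raw converters layer black-level subtraction, white balance, highlight recovery, noise reduction, and far cleverer interpolation on top of this.

```python
# A minimal sketch, not anyone's production pipeline: bilinear demosaicing of
# an RGGB Bayer mosaic. The point is only that the sensor records one value
# per pixel, and software has to reconstruct the other two color channels.
import numpy as np
from scipy.ndimage import convolve

def demosaic_rggb(raw):
    """raw: 2-D array of sensor values laid out in an RGGB Bayer pattern."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # red sites
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # blue sites
    g_mask = 1 - r_mask - b_mask                         # green checkerboard

    # Bilinear interpolation kernels for the sparse color planes
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0

    r = convolve(raw * r_mask, k_rb)
    g = convolve(raw * g_mask, k_g)
    b = convolve(raw * b_mask, k_rb)
    return np.dstack([r, g, b])

# Exercise it on a fake 8x8 "sensor readout"
mosaic = np.random.default_rng(0).uniform(0.0, 1.0, (8, 8))
print(demosaic_rggb(mosaic).shape)   # (8, 8, 3)
```

Even this toy version hints at where the hard problems live: edges, fine detail, and colors that the interpolation has to guess at.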
Functionally, digital imaging is like film photography in that you ultimately get an image on paper — but only similar in the way that a Selectric typewriter and a laser printer can both put crisp Courier text on a piece of white office bond. In both instances, you start with a keyboard and end with clean text, but the intervening operations are completely different. And with photography, both film and digital begin with using a camera and end with a physical image. But nothing in the middle is the same. That makes two things immediately suspect: (1) claims by manufacturers that their digital cameras build on their film competencies; and (2) claims by photographers that people should avoid using some of the possibilities that digital technologies provide. Leica culture is guilty on both counts. The easy part to identify is the design ethos of the digital M line: a digital M is designed to look like a film camera and not like a ground-up digital camera. This is understandable in light of the other part: the hard core of Leica culture thinks like Hesiod, for whom there was a golden age (the M3), a silver age (the M2), and a progression of lesser ages that run up to and include the current product line (iron age is especially appropriate given Leica’s recent penchant for stainless steel). Even among apostates who keep buying new Leicas (scribe, prepare the interdict!), technological resistance has historically expressed itself in apologetics. Leica zealots denounced autofocus — or autoexposure, or auto-advance, or digital, or whatever Leica’s R&D budget had not yet allowed Solms/Wetzlar to implement at the time of the denouncement. With autofocus, it was not entirely Pharisaic; even today, the only truly competent AF seems to come from larger, heavier DSLRs. But just as Paleo diets have captured the imagination of some, there is a set of rangefinder users who would like to go back to the days of the Kodak DCS line, when men were men and “chimping” referred to primates at play. Or better yet, they would like to return to the metaphor of the M3.
The Leica „M Edition 60” is simultaneously the fantasy and horror of Leica traditionalists. One group seeks continuity: an ersatz film camera suggests an unbroken line. Where that is not compelling, another craves “simplicity.” And yet others believe that omitting things like a screen would make a camera less expensive. A $20,000 camera package that is no lighter or smaller than a Typ 240 is going to sorely disappoint two out of these three groups. The acrimony is understandable. The remaining group might find suspension of disbelief easier. After all, Byzantine emperors still thought of themselves as Romans.
It is fair to guess that a camera made in an edition of 600 and packaged with white handling gloves will never sully its sensor with photons nor flush it with electrons. If it did, there would be legitimate questions of whether a digital camera, particularly a Leica one, is viable without an LCD screen and shooting only RAW:
- Shooting in DNG (i.e., RAW) is a poor substitute for proper exposure – and the Leica M meter has a tendency to produce results outside an easy adjustment range under a variety of circumstances: sunrise, sunset, flash. If the metering were more sophisticated on this camera, it might provoke less concern. But it’s fair to say that in tricky light, shooting the M architecture blind is not unlike exposing Kodachrome by guess. That, one assumes, is why the Typ 240 has auto-bracketing available.
- Lack of JPG capability can severely cabin on-the-road productivity and completely inhibits the use of Eye-Fi.
- Certain mixed lighting conditions that are relatively invisible to the eye (such as incandescent and daylight in the same frame) are detectable with an LCD, are correctable on-site at the time of shooting, and are extremely difficult to fix afterward.
- It would be a bitter pill to have a malfunction persist through a shoot, ruin the shots, and go undetected until it was too late to make corrections. Think: rangefinder misalignment or a spot on the sensor, especially when traveling.
In addition, some normal digital camera functions are completely dependent on the use of an LCD:
- Sensor cleaning is a stab-in-the-dark exercise without being able to look at stopped-down exposures quickly. And in any event, one would lose the dust detection capability of the camera.
- Lens profile selection becomes entirely dependent on Leica 6-bit coding.
- Filename/folder arrangements, formatting SD cards, and other “disk maintenance” functions are gone, making it hard to clear space if needed.
- Firmware updates would be difficult to implement.
And then there are some other things (normal features of digital and even many film cameras) that go away with the M Edition 60:
- Strap lugs
- Video (this is explicit)
- Liveview
- Histograms
- Self-timer settings
- Exposure bracketing
- Slow sync controls
- Auto ISO
- Frameline color
- Focus peaking
- Clipping detection
- USB mode controls
- Date/time setting
- User (settings profiles)
- Anything that has to do with JPEG generation (white balance, resolution, compression, film modes, color space)
There is no EVF workaround because the camera lacks an EVF port. So yes, as a digital camera, it is quite limited. These limitations may not have much effect on individuals shooting for pleasure. Theirs is no worse than the experience of shooting film, though the foibles of electronics inject a new element of risk. Photographers working in high-pressure contexts will not use something like this for the same reason they do not use medium format digital cameras: it is not the absolute disadvantage; it is the competitive disadvantage.
Functionality is a non-issue. Though a few perverse people will actually use the M Edition 60 to take pictures (just as one could use a silver dollar as currency), that is unlikely ever to be common. Leica’s replacement for the M Typ 240/M-P will undoubtedly have more technology, not less, and the superb industrial design of the „M Edition 60” will become a footnote like the M9 Titanium designed by Porsche or the M6J. Features of these special models may reappear (as the high-magnification finder of the M6J and the LED-lit framelines of the M9 Titanium did), but the whole package will not. The terrorists have not won; we can go back to screens and JPGs and video.
Leica, ultimately, wins here. It does not win on profit – a product with a run this small barely pays for its own tooling. It wins in media exposure. Google “M Edition 60” and you will see that this device has put Leica on Engadget, DPReview, Wired, Forbes, CNET, and Petapixel. This puts the Leica line in front of a lot of people who previously did not know what Leica is – and more importantly, it puts the Leica brand in front of many people with disposable income. Not only does this represent a lot of free advertising for a niche brand, it is also likely aimed at selling more $7,000 M Typ 240s to people who don’t have $20,000 to drop on an M Edition 60 package.
Well played, Leica.
Twilight of the viewing filter
Among many other things that are fading away with film is the viewing filter. The Kodak Wratten #90 has long been the standard, though as a discontinued item, it is getting rare and expensive. The Zone VI mounted filter is long gone. If you get moving, you can still pick up the Tiffen Viewing Filter #1 ($40), which is a Wratten #90 laminated in glass and mounted in a phenomenally nice metal holder made in the U.S.A. (you cannot say as much for the velcro pouch). It is also only marginally more expensive than an unmounted #90.
If you read the casual descriptions, a “viewing filter” is something that “converts scenes to black and white.” That’s not exactly true; such a filter uses a dark color so overwhelming that your eye cannot easily discriminate the colors in a scene. The #1 filter, designed for black and white photography, is a very dark brown. It purportedly shows you a “normal” film response, which is a somewhat arbitrary notion (the look really depends on your film and developer). Viewing filters come in other varieties and filter colors: they are (or were) also made for low- and high-speed cinema films and chroma key work.
But at a minimum, the device does show you where certain dark tones get muddy and where the highlights are. This in itself makes such a filter worthwhile – at least as a warning device. You can stick your black-and-white contrast filters in front of it (for example, a green filter to correct incandescent light), but it only works to a point – objects of complementary colors do indeed darken, but your eye quickly adjusts to acquire whatever color information it can, however weak.
As to the ready-made unit vs. unmounted gel issue, you might want the unmounted gel if your goal is to implant this filter into an existing accessory viewfinder. A Wratten gel is optically insignificant in terms of distortion, and because it is moisture-sensitive, it benefits from being inside a viewfinder unit rather than on the outside. A ready-made unit will be more durable and resistant to abuse, though it is just another thing to haul around (you could always attach it to the strap of your light meter).
Are alternatives available? Of course. You could go through a $2 Roscolux swatch book until you found something with a similar effect (though it might be a different color). Or you could find a set of old-school, bottle-brown sunglasses, which, though not quite as dark as a #90, are quite helpful for visualizing black and white. And if you want to be truly perverse, you could set your iPhone to its black-and-white filter and use that as a visualizer.
# # # # #
Fix it now or fix it later?
For every photographic problem that might be addressed at the time of shooting, there always seems to be someone’s glib response that you “can fix it in post.” It is indeed possible to do many things with Lightroom, Photoshop, or GIMP – but is that the best or easiest way to do it? Let’s examine ten common correction operations, how they play out when shooting or in post, and which seems to be the better (or at least most efficient) option.
1. Perspective correction and leveling. Using a wide-angle lens (<35mm) at anything but a dead-level position causes converging (or diverging) verticals. In the dark days before Photoshop, converging verticals were mitigated with PC lenses that shifted the lens relative to the film. This shifted the horizon and the effective viewpoint of the camera (10mm of shift compared to a 24mm frame height can move the horizon line more than 40% up or down). Older shift lenses had larger image circles to accommodate this, but they also show chromatic aberration on digital sensors – and they required inconvenient stopped-down operation for viewing and then metering. Newer lenses have electronically controlled apertures that help compensate for some of this. Correcting converging verticals in post-processing (sketched in code after this list) avoids the optical compromises and difficult metering, though the “warp” to the frame (which goes from rectangular to trapezoidal) cuts down the frame size, changes the effective aspect ratio of the picture, and compromises fine details if you’re starting with a low-res file. But the bigger problem is that most programs are not really capable of correcting perspective issues without distorting the vertical/horizontal proportions of the picture – generally making things look too tall. DxO Viewpoint has a ratio corrector, but it still requires visual estimation of a viewing angle that you never saw in real life. In terms of misery level, the easiest option is to get a wider lens, get as close as you can keeping the subject level, and simply crop as necessary. Time of shooting.
2. Vignetting control. Older lenses, especially symmetrical ones, often exhibit darker corners on digital sensors (they did on slide film as well, but on the negative film that most people used, this was less visible). Vignetting is a limitation imposed by physics. It also occurs with lenses designed for digital, but in many cases the camera can automatically compensate for a known lens when generating a JPG. At the time of shooting, if you are recording RAW files, your only real option is a center filter. These very expensive filters impose big losses in terms of film speed (typically requiring 1.5x the exposure) and work best at smaller apertures. Even where there is no Lightroom profile for your lens, other solutions such as CornerFix and Adobe Flat Field allow you to shoot control pictures for repeatable corrections in the future – and to shoot with no exposure increase (see the flat-field sketch after this list). Post.
3. Fill light. There are those who profess never to use flash and only whatever light is available. No one knows what they do with pictures that exhibit dark eye sockets, awkward shadows, and dominant light sources that point the wrong way. You can fix some of this in post, but simply raising the exposure in certain parts of the image can make it difficult to maintain a natural-looking result. The major solutions here are to compose to face the dominant light source, use a reflector, or (heaven forbid) use fill flash. Time of shooting.
4. Light balancing (cooling). Low incandescent light provides unique challenges for digital sensors, almost all of which have noisy blue channels. Room light is typically pretty low, and the ISO setting on the camera typically ends up being pretty high, which means more noise across all channels. Using white balancing to compensate for reddish incandescent light exacerbates the problem in the blue channel by amplifying it even more (a toy illustration of this follows the list). If you have a steady enough hand to do it, using an 80A (KB-15) filter drops the red and green channels so that the noisy blue channel is not unduly amplified. You lose 2/3 of the light doing this, but it cuts down on chroma noise. Time of shooting.
5. Light balancing (warming). The red channel does not suffer from the noise issues that the blue does, so it is fine to amplify it later. This in itself is not too compelling, but consider how, at the time of shooting, warmer, no matter how warm, seems better – and yet in editing, things often look too warm. So consider limiting your filter use to an 81A (or KR3) and do any additional warming later. Post.
6. Red enhancement. The didymium red-enhancing filter has largely gone out of production (possibly due to weak demand and possibly due to RoHS considerations). Its effect, which is to suppress “every other color” in the red-yellow range and then everything else past it, is extremely difficult to reproduce in post, if only because the peaks and valleys, occurring every 25nm or so, do not correspond with available adjustments to color in Lightroom (many of these actually fall between colors). Although it might ultimately be possible to reverse-engineer the effect, it would be a pain… Time of shooting.
7. Graduated neutral-density filtration. In color work, at the time of shooting, your only real choice to make the sky darker without a polarizer is a graduated neutral-density filter. The best versions are rectangular and allow you to rotate and move the horizon line. That said, they are much more unwieldy and flare-prone than circular grad filters, which are compact and easy to use but completely inflexible in horizon line (midpoint of the gradient) placement. And with either, the hardness of the gradient needed is defined by the lens in use (oddly, only the rectangular versions offer a choice of hardness). Longer lenses require a harder cut. Provided that the dynamic range of your scene permits it, the better solution is using gradient filters in Lightroom. These are variable for center position, rotational angle, and steepness of the gradient (sketched after this list). In fact, they can be combined with other adjustments. The quality loss is minimal for simply darkening part of a scene; usually it is a relatively detail-free area like the sky. Post.
8. Specialty filtration. Softeners, diffusers, cross-screens, diffractors, and the like are filters for which there is no good Photoshop equivalent (assuming, of course, you are into the looks these filters create). Time of shooting.
9. Black and white tone adjustment. If you are into the effects of colored contrast filters on black-and-white film, you cannot very easily bolt such a filter onto a camera with a Bayer filter, because some filters (particularly red) can cause havoc with demosaicing. The Channel Mixer function in Photoshop (and Lightroom) lets you selectively raise or drop colors (at least within -20/+20) without too deleterious an effect on the image (a small channel-mixer sketch appears after this list). The sole exception is the Leica M Monochrom, which, having no color data to work with, must be filtered at the time of shooting. Post.
10. Correcting mixed lighting. Balanced fill flash falls apart any time that a flash is being balanced against something with a different color temperature. The most common problem is in room light, where at base ISO, flash essentially becomes the only light source, making the subject bright but the rest of the frame relatively dark. Raising the ISO tends to even out brightness, but it leads to pictures where the background is yellowish and the flash-lit subject looks normal. Although this can be corrected with a lot of work later, the easiest thing to do is to gel the flash with an 85A filter to make its light the same color as the room light. Time of shooting.
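For the curious, here are a few sketches of what some of the “post” options above amount to in code. They are illustrations under assumptions of my own, not recipes: everything is in Python with numpy and OpenCV, the file names and numbers are invented, and the real tools named above (Lightroom, DxO Viewpoint, CornerFix, Photoshop) do considerably more. First, the perspective warp from item 1, mapping four hand-picked corners of a leaning subject onto a true rectangle:

```python
# A minimal sketch of keystone correction in post: map four hand-picked
# corners of the leaning subject onto a true rectangle. The file name and
# corner positions are made up for illustration.
import cv2
import numpy as np

img = cv2.imread("facade.jpg")                 # hypothetical source frame
h, w = img.shape[:2]

# Where the building's corners actually fell (top edge pinched inward)...
src = np.float32([[0.10 * w, 0.05 * h], [0.90 * w, 0.05 * h],
                  [0.02 * w, 0.95 * h], [0.98 * w, 0.95 * h]])
# ...and where we want them: the corners of the output frame.
dst = np.float32([[0, 0], [w, 0], [0, h], [w, h]])

M = cv2.getPerspectiveTransform(src, dst)
corrected = cv2.warpPerspective(img, M, (w, h))

# Note the costs described in item 1: pixels near the formerly "short" edge
# get stretched (softening detail), and proportions change unless you also
# rescale or crop afterward.
cv2.imwrite("facade_corrected.jpg", corrected)
```

The output is only as rectangular as your guess at the corners, which is the same visual-estimation problem noted above.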
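Item 2’s flat-field approach is, at bottom, division by a normalized control frame. A minimal version, assuming 16-bit TIFF inputs and invented file names:

```python
# A minimal sketch of flat-field vignetting correction: shoot a "control"
# frame of an evenly lit, featureless surface with the same lens, aperture,
# and focus, then divide its (normalized) falloff out of the real picture.
import cv2
import numpy as np

photo = cv2.imread("photo.tif", cv2.IMREAD_UNCHANGED).astype(np.float64)
flat  = cv2.imread("flat_frame.tif", cv2.IMREAD_UNCHANGED).astype(np.float64)

# Blur the flat frame heavily so we correct the slow corner falloff,
# not its dust spots or noise.
flat = cv2.GaussianBlur(flat, (0, 0), sigmaX=51)

# Gain of 1.0 at the brightest point, rising toward the darker corners.
gain = flat.max() / flat
corrected = np.clip(photo * gain, 0, 65535).astype(np.uint16)

cv2.imwrite("photo_flatfielded.tif", corrected)
```

Once the flat frame is shot, the same correction is repeatable for every picture taken with that lens and aperture, which is the appeal over a center filter.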
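The reasoning behind item 4 is easy to see with a toy simulation (all figures are made-up round numbers): white-balancing a tungsten-lit frame multiplies the weak blue channel, and its noise, by the largest gain.

```python
# A toy illustration, not a sensor model: under very warm light the blue
# channel records the least signal, so the white-balance gain (and therefore
# the amplified noise) is largest there.
import numpy as np

rng = np.random.default_rng(0)

signal = {"R": 1.00, "G": 0.70, "B": 0.25}   # relative raw levels of a gray card
read_noise = 0.02                            # same sensor noise in every channel

# Gains needed to bring the gray card back to neutral
gains = {c: signal["R"] / s for c, s in signal.items()}   # blue needs ~4x

for c in ("R", "G", "B"):
    raw = signal[c] + rng.normal(0.0, read_noise, 100_000)
    balanced = raw * gains[c]
    print(f"{c}: gain {gains[c]:.2f}x, noise after white balance {balanced.std():.3f}")

# An 80A filter at the time of shooting attenuates red and green optically
# instead, so the blue channel needs far less digital gain, at the cost of
# roughly 2/3 of the light.
```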
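Item 7’s software graduated ND is nothing more than a smooth per-pixel exposure mask, which is exactly why its midpoint, angle, and softness remain adjustable after the fact. A sketch with invented parameter values:

```python
# A minimal sketch of a software "graduated ND": darken the sky with a linear
# gradient whose midpoint, angle, softness, and strength are all parameters.
import cv2
import numpy as np

img = cv2.imread("landscape.jpg").astype(np.float64) / 255.0   # hypothetical file
h, w = img.shape[:2]

def grad_nd(shape, center=0.40, softness=0.15, angle_deg=0.0, strength=1.5):
    """Per-pixel exposure multiplier: ~1/strength above the line, 1.0 below,
    with a transition `softness` frame-heights wide."""
    hh, ww = shape
    yy, xx = np.mgrid[0:hh, 0:ww]
    theta = np.deg2rad(angle_deg)
    # Signed distance (in frame-height units) from the rotated "horizon" line
    d = ((yy - center * hh) * np.cos(theta) - (xx - ww / 2) * np.sin(theta)) / hh
    t = np.clip((d + softness / 2) / softness, 0.0, 1.0)   # 0 = sky, 1 = ground
    return (1.0 / strength) + t * (1.0 - 1.0 / strength)

mask = grad_nd((h, w))
out = np.clip(img * mask[..., None], 0.0, 1.0)
cv2.imwrite("landscape_grad.jpg", (out * 255).astype(np.uint8))
```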
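Finally, item 9’s channel mixer is a weighted sum of the color channels, with the weights chosen to echo what a contrast filter would have done in front of the lens. The weights below are illustrative guesses, not Photoshop’s presets:

```python
# A minimal sketch of black-and-white conversion by channel mixing: weight the
# channels to taste (heavy red is roughly the dark-sky look of a red filter).
import cv2
import numpy as np

img = cv2.imread("scene.jpg").astype(np.float64) / 255.0   # OpenCV loads as BGR
b, g, r = img[..., 0], img[..., 1], img[..., 2]

def channel_mix(r_w, g_w, b_w):
    """Weighted sum; keeping the weights near a total of 1.0 holds overall
    brightness roughly constant."""
    return np.clip(r * r_w + g * g_w + b * b_w, 0.0, 1.0)

neutral      = channel_mix(0.30, 0.59, 0.11)   # roughly a luminance conversion
red_filter   = channel_mix(0.80, 0.15, 0.05)   # darker skies, lighter skin
green_filter = channel_mix(0.10, 0.80, 0.10)   # lighter foliage

cv2.imwrite("bw_red_filter.jpg", (red_filter * 255).astype(np.uint8))
```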
None of this is to say that there is anything wrong with post-processing digital images, and in fact, some things can only really be done digitally (fine-tuned and synchronized white balance, distortion removal, sharpening, etc.). But it is to say that a little more care in shooting can cut down on the time and frustration involved in post-processing.
# # # # #