Clayton's Notes

AI and Photography

Once upon a time, to be a photographer meant a very serious investment in the tools of photography. To be a photographer, you not only had to buy all of the equipment to capture and develop photos, but you had to understand types of film, lighting, exposure, aperture, composition, and the film development process. You had to understand all of these things, or what you produced would be a blob. It just wouldn’t work.

Film development shops sprang up, and it got easier. Suddenly not just enthusiasts or professionals could make images, but anyone. Auto-focus lenses made it even easier. The barrier to entry was lowered. And then digital cameras came out, and once they got cheap and good, suddenly anyone could make an image, anytime. Then they got miniaturized into cell phones, and it got even easier.

You don’t have to know about lighting or exposure or aperture or anything else to make an image today. You just click a button while pointing a tiny rectangle at the thing you want to make an image of, and it’s done. And then you send it to your friends, never understanding along the way how that image is translated into a series of 1s and 0s and sent across the town or country or world to reach someone else (what a marvel!).

Nevertheless, professional photographers still exist. To reliably make impactful and stirring images, you must study photography at least a little bit. And the true masters of photography still understand nearly everything that a photographer of 100 years ago would, barring things like the vagaries of film development, which are simply not relevant anymore unless you belong to the tiny minority of photographers still working with film. They have to understand these principles if they want to make good images. The fact that your camera can auto-focus and determine what it thinks is the optimal aperture and shutter timing is convenient and can save you time, but you still have to understand how these things work, even though you often no longer have to configure them manually for each shot. You certainly will still specify them manually for certain images, to get the result exactly the way you want it to look.

However, while there has been a Cambrian explosion of photographs being taken, most of them are crap. Neither impactful nor stirring in any way. In fact, it is likely that the majority of them will never even be seen again by their maker, lost among 20 other shots of the same scene.

Too, many of the photos being taken aren’t taken to be impactful or stirring. They are blurry snapshots of your friends at the bar that will make you and you alone laugh next year when you come across them. Or they are pictures of a receipt that you wanted a copy of, or a picture of the sprinkler system junction box before you took it all apart to fix a leak. Personal photos that only you or perhaps a few other people will ever care about. It does not matter that you do not understand lighting, composition, aperture, etc., because after you have completely borked your sprinkler system re-assembly, you can go to your phone to see how it was all supposed to go together. Wonderful!!!

There are certainly more talented photographers than there were 50 or 100 years ago, even adjusting for population growth: the bar to take it up is very low! It is much easier to get hooked. But the ability for everyone to take images does not mean that everyone can be a professional photographer. It still requires a lot of effort to learn the craft of photography. Nor has it made the profession of photographer obsolete. There are still many professional photographers.

It seems that software development is going through a similar movement right now. ChatGPT and its ilk can make software for anyone, from a simple prompt. Real software, that real people can use. I’m going to propose a few logical conclusions from this, looking to the evolution of photography as an example.

Anyone will be able to make software.

There will be far more software-makers than there were before.

Most of it will be crap.

Most of the time, that won’t matter, because it will be just for them, or them and their family or friends.

Professional software developers will not go away. There may be even more of them than there were before, and a greater demand for software than ever before.

As a general rule, professional software developers will still need to understand the principles of how their machines work to create understandable, reliable, and impactful systems. Of course, random pieces of software will be brought into being by complete incompetents and will still go on to be wildly successful, the same way there are random images which are in no way “good pictures” that go viral and which a huge percentage of the population will see and remember.

Professional software developers may not need to understand the arcane details of their systems as much as they used to, just as most photographers today do not need to understand how to develop analog film. But they still will need to understand the fundamental principles of computers and software systems, especially if they are contracted to deliver software that does consequential things.