Leave off tilt-shift
The Instagram app is infamous for its faux-vintage filters. But it has another faux function: tilt-shift. The term is borrowed from old-school photography, where special lenses on big cameras let you tilt the plane of focus and selectively sharpen one part of a distant scene. Normally you can only focus on something relatively near you. A classic use is to make a scene look like a miniature village. With our eyes, we only see one part of a scene blurred and another sharp when one of the two is close to us. Tilt-shift lets us photograph two things that are both in the distance with one blurred and one sharp, tricking our brain into thinking that one is actually close to us, as happens when we look at a miniature village.
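The fake version of the effect is easy to sketch in code. Here's a minimal, hypothetical illustration (not Instagram's actual implementation): blur the whole frame, then restore a sharp horizontal band, using a grayscale image represented as a plain 2D list.

```python
def box_blur(grid, radius=2):
    """Naive box blur on a 2D list of grayscale values (0-255)."""
    h, w = len(grid), len(grid[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average the pixel with its neighbours, clipped at the edges.
            vals = [grid[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) // len(vals)
    return out

def fake_tilt_shift(grid, focus_top, focus_bottom, radius=2):
    """Blur everything, then put the original rows back inside the focus band."""
    blurred = box_blur(grid, radius)
    for y in range(focus_top, focus_bottom):
        blurred[y] = list(grid[y])
    return blurred
```

A real implementation would blend the band edges gradually rather than switching abruptly from sharp to blurred, but the principle is the same: the "focus" is painted on after the fact, which is exactly what the purists object to.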
People complain about everything to do with Instagram, and a recent Guardian article added tilt-shift to the list, along with the more obvious (yawn) faux-vintage filters. Some people take exception to any manipulation of an image after you take the photo. Somehow it's OK before you take it (as you do with a DSLR) but not after. I say there's no difference. Every time we take a photo, we produce a version of reality; we've manipulated the image in advance. Show me a camera that takes a picture exactly true to reality. It doesn't exist. And even if it did, you'd still have to place a frame on reality, selectively choosing the part of it you want to show. Also, we humans have peripheral vision. Look at something: it's focussed, you can see it in detail. But the things in your peripheral vision are blurred and indistinct. Remind you of something? So could it be argued that tilt-shift is actually a closer representation of reality than a straight photo? Of course not, because nothing is.
Closer to reality?