About bjoernhirschphotography

Photography is a love affair with light: Traveling, Living in Paris, Portraits, and, for good measure, some historic cars. All photos are mine.

The Selfie as We Know It Is Dead

The duck face, the fish gape, the smize—these are just a few of the time-honored poses that celebrities, influencers, and the Instagram-happy masses have relied upon to create perfect selfies. But a lot has changed since the early aughts, when people first started training their smartphone lenses on themselves. Today, selfie-takers can achieve poreless, doll-like symmetry through feature-enhancing apps like FaceTune, or they can hire on-demand photographers through ElsiePic to capture their adventures for them so they can remain “in the moment.”

But is a selfie still a selfie if someone else is taking it for you? Is intimacy lost when your look is digitally modified, or is that just better living through technology? Somehow, paying a photographer to art-direct your life feels antithetical to the spontaneity that was once associated with #iwokeuplikethis or even the much-maligned bathroom mirror pic. Could it be the selfie has come to an end? Kim Kardashian West seems to think so. Yes, the woman who once released a coffee table book of selfies has concluded that, in her professional opinion, the trend is basically over. Data from Google Trends has also shown a steady decline in searches for the keyword since it was added to the dictionary in 2013.

Want further proof that the selfie is a thing of the past? The art form, like so many relics of antiquity, is now in a museum. The Museum of Selfies, currently on view in Glendale, California, is an interactive exhibit documenting the rise of the selfie and perhaps its ultimate demise. But Tommy Honton, co-curator of the museum, thinks that despite Kardashian West’s proclamation, the selfie is still alive and well. “Selfies are just another form of self-portraiture, so saying the selfie is dead is like saying the era of photography is over,” Honton says. And his exhibit is proof of that, inviting visitors to look beyond the assumption that the selfie is a symbol of narcissism and instead see it as a form of artistic expression.

Cultural critic Negar Mottahedeh takes it a step further, saying that even more than a bit of digital vanity, the selfie is “a networked object that connects us with others beyond our physical environment through an online collective.” She should know; Mottahedeh teaches a class on the subject at Duke University that focuses on the global history of portraiture since the 19th century as well as our desire to document the ordinary. And in addition to making a record of the everyday, she notes the selfie has played an integral role in citizen journalism during events like the 2009 elections in Iran and the protests that gave rise to the Arab Spring.

Mottahedeh, who is also a member of the Selfie Research Network, explains that as corporate influence weakens social media’s capacity to create networks of resistance or solidarity amongst people, its power as a useful tool for popular politics is diminished. For her, that means the power of the selfie is dwindling too. “In the early days of the selfie it appeared as if two forms of representation were being democratized, that of the portrait and that of the proxy. I don’t find that it carries that possibility anymore,” Mottahedeh says. She hopes that Facebook’s Cambridge Analytica scandal will bring attention to the forces behind our newsfeeds. “We need to be very aware that each click, like, and comment we make is a signal to those in power, be it corporate or governmental,” she explains.

Mottahedeh sees the younger generation embracing meme culture as the political intervention of the moment, recontextualizing politics through viral images and bringing them into the real world through the protest posters of marches and demonstrations. In turn, those images are photographed by citizen journalists and then redistributed across social media platforms.

This is where Honton believes the selfie still has value, that it’s still the most accessible way to capture the immediacy and intimacy of the individual’s everyday experience. The goal of his temporary museum, which runs through May, is not only to include the selfie in the history of photography, but to subvert the idea that we’re just living life through the interface of our phones. “Selfies are powerful because they let us author our own stories,” Honton says. As for the future of digital self-portraiture, he believes the selfie will live on. For Honton, “even when we’re living as virtual avatars of ourselves a la Ready Player One, we’re probably still going to want to take a selfie of that experience as our virtual selves.”

The selfie may be over, but it will never truly die.

via Feed: All Latest https://ift.tt/2HRCHEg

Why You Should Consider Using Lightroom Mobile on the Go

Lightroom Mobile continues to become a more capable companion to the desktop app. This great video examines the benefits of a mobile workflow and how it can make your life easier when you sit back down at your computer.

Coming to you from Ted Forbes of The Art of Photography, this helpful video shows how using Lightroom Mobile to import and edit files can make your workflow more efficient and enjoyable. I was a fan of the iPad Pro in my review, and it continues to be a big part of my mobile workflow; in fact, with the great screen and tactile experience, I prefer working on a tablet a lot of the time. It saves me a lot of time and lets me get a head start on culling and editing.

Here’s an additional tip: you can set Lightroom to automatically download images from the cloud to wherever you want, so if you keep your photos on an external drive, you can make sure they end up where they belong. To do this, go to Lightroom’s preferences, open the Lightroom Sync tab, check “Specify location for Lightroom CC ecosystem images,” and choose the location. Also, note that since you’re uploading raw files to the cloud, this method works best when you have a relatively fast Wi-Fi connection.

via Fstoppers https://ift.tt/2JdU2GL

Quick Tip: How to ‘Auto’ a Single Slider in Lightroom

Lightroom Classic has long had an “Auto” feature in the Develop module that will automatically set basic sliders for you based on the image at hand. But did you know that you can now “Auto” set individual sliders?

This simple but useful trick is discussed and demonstrated in the 44-second “Lightroom Coffee Break” tutorial above by Adobe. Benjamin Warde shares how the basic Auto system was revamped in Lightroom Classic version 7.1 to adjust your photo more intelligently and give you a solid starting point for your edits.

But in addition to automatically setting the values for all sliders, you can select individual sliders by holding down Shift and then double-clicking the label for the slider you wish to intelligently Auto set.

Auto-setting Whites: holding Shift changes the “Reset” button to “Reset (Adobe)” (center screenshot), and double-clicking “Whites” Auto-sets the value (right screenshot).

This is a simple way of letting Lightroom intelligently suggest values for some aspects of a photo while you keep others under your sole control from the beginning.

via PetaPixel https://ift.tt/2JKpNbt

Using the Auto Settings in Lightroom Classic CC

Lightroom tips and tricks in 60 seconds or less from longtime Lightroom team member Benjamin Warde.


About Benjamin Warde:

I live in San Francisco, work for Adobe on the Lightroom team, and am an avid hobbyist photographer. Let’s Get Connected:

500px.com/bengeance

Text, image and video via Adobe Photoshop Lightroom


via ISO 1200 | Photography Video blog for photographers https://ift.tt/2qvAAOd

Dunning-Kruger Effect: Why you’re not as good a photographer as you think you are

Have you ever noticed that as you learn more about the world of photography, you tend to realize just how little you actually know? This phenomenon is what’s referred to as the Dunning-Kruger effect.

London-based photographer Jamie Windsor recently took to his YouTube channel to explain what it is, how it affects you and your work and even shares five things you can do to overcome thinking you know more than you actually do.

A chart from the video showing how perceived ability compares with actual ability according to the Dunning-Kruger Effect.

As explained in the video, the phenomenon is named after two social psychologists, David Dunning and Justin Kruger. In their research, the two recognized that the less competent someone was at a given task, the better they thought they were. Put more simply, if you think you’re a great photographer, there’s a good chance you’re not nearly as amazing as you think you are.

Almost everyone falls victim to the Dunning-Kruger effect at some point in their career. But the more self-aware you become, the less likely you are to fall into the trap of being a bad photographer who thinks they’re good. To help combat this downward spiral, Windsor shares a few tips, which we’ve paraphrased and elaborated on below:

  1. Beware of feeling comfortable – If you start feeling comfortable in your abilities, try something new and expand your horizons. Don’t get complacent.
  2. Learn to let go of old work – Always try to one-up yourself and make your next shot your best shot. If you still think that shot from four years ago is your best, you probably haven’t improved much.
  3. Ask for feedback and constructive critique – It’s not always easy to hear, but an outside perspective can help you get a broader and more realistic view of your skills and ability.
  4. Always keep learning – “You have never learnt everything.” Never think you’ve finished learning something—everything is a rabbit hole of knowledge.
  5. Feeling bad about your old work is a sign of progress – Thinking your old work isn’t great means you’ve learned where you’ve fallen short and know how to improve your work.

In the end, Windsor emphasizes that no matter what you think of your work or how far you’ve come, it’s ultimately about enjoying the ride. His parting piece of advice is to ‘learn why you’re doing things, not just how to do them’.

To find more videos, head over to Windsor’s YouTube channel and subscribe.

via Reddit

via Articles: Digital Photography Review (dpreview.com) https://ift.tt/2qrS6DI

10 Hidden Lightroom Features That Can Change Your Editing Life

Sometimes we can use software for years without taking the time to dive deeper and unlock its full potential. Here’s a great 12-minute tutorial by photographer Jamie Windsor with 10 hidden Lightroom features that can help your editing and speed up your workflow.

Even if you feel like a seasoned Lightroom user, there may be some tips and tricks among these 10 that can help improve your photo editing life.

Here’s a quick rundown of the 10 things covered by Windsor:

1. Send clients an online preview (00:11)
2. Better tones with camera calibration (02:23)
3. Change preset strength/opacity (03:17)
4. Targeted adjustment tool (05:18)
5. Automatically match exposures with different settings (06:25)
6. Faster image rating and selection (07:29)
7. Individual slider automation (08:13)
8. Edit local adjustment tools (08:29)
9. Bigger sliders (09:24)
10. Precision editing view (09:50)

You can find more of Windsor’s videos by subscribing to his YouTube channel.

via PetaPixel https://ift.tt/2JCxZKO

All the Ways Smartphone Cameras Have Improved Over the Years

Year after year, smartphone cameras have become more capable, more versatile, and more of a reason to leave your DSLR at home. So what are the tech innovations that have made the Pixel 2, the iPhone X, the Galaxy S9 and others such good photo takers compared to that old iPhone 6 or Galaxy S5?

Obviously, the technical aspects of photography and cameras can get very nuanced. But in broad strokes, here’s a look at the ways some key technologies have improved over the years to make your ’grams sharper and your snaps brighter.

More megapixels

At the core of the camera spec sheet is the number of megapixels it captures—simply put, the resolution of the image. It was by no means the first smartphone with a camera, but for comparison purposes, the original 2007 iPhone came rocking a 2-megapixel rear camera with a fixed focus, capable of capturing images 1600 x 1200 pixels in size. Today’s Galaxy S9 and iPhone X have 12-megapixel cameras.
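
For a rough sense of what those numbers mean, a megapixel count is just the frame’s width times its height, in millions of pixels. A quick back-of-the-envelope sketch (the 4000 x 3000 figure below is a typical 4:3 12-megapixel frame, not a quoted spec):

```python
# Quick sketch: a megapixel count is just width x height in millions.
def megapixels(width_px: int, height_px: int) -> float:
    return width_px * height_px / 1_000_000

print(megapixels(1600, 1200))  # 1.92 -> the original iPhone's "2 MP"
print(megapixels(4000, 3000))  # 12.0 -> a typical 12 MP frame today
```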

In the early days of smartphone cameras, megapixels were the yardstick that these components were measured by: More megapixels meant a better camera, generally speaking. But that isn’t necessarily true now and it wasn’t necessarily true then, because there are a whole host of other factors that affect image quality, as you can see from the extensive list below.

The problem with cramming more pixels into the same-sized sensor is that the pixels get smaller and let in less light. Remember the HTC UltraPixels introduced in 2013? That was an attempt to reduce megapixels, increase pixel size, and therefore capture more light (and more detail) as the camera shutter flashed open for a split second. HTC was on to something, because today the megapixel race has all but ended, with smartphone makers making improvements elsewhere instead.

Bigger sensors

It is a truth universally acknowledged that the bigger the image sensor in a camera, the better the end result (essentially, it lets the camera capture more light and more color detail). With any camera, you’re relying on several components working well together, but the image sensor is a crucial one.

It’s a shame then that there’s not much room inside smartphones—mobile camera sensors tend to be between 1/2.3 and 1/3 inches, much smaller than those inside DSLRs and even quality point-and-shoot cameras, though manufacturers are often coy about the specs in this regard. In fact, sensor size hasn’t changed much over the years in smartphone photography, because of those physical limitations, and it’s usually been in other areas where improvements have been made.

You’ll have a hard time digging down into any phone’s specs to find the advertised camera’s image sensor size, but the Nexus 6P was an exception—its 1/2.3-inch sensor is on the larger end of the scale, particularly for 2015, though sensor size alone isn’t a spec where modern-day devices are all that much better than phones of yesteryear. Note too the 6P’s 1.55 μm (micrometer) pixel size, larger than the 1.4 μm pixels in the Pixel 2, which has a 1/2.6-inch sensor.
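
Those pixel sizes follow directly from sensor width and horizontal resolution, and the light each pixel gathers scales with its area. A rough sketch (the sensor widths are approximate conversions of the "type" sizes, not official specs):

```python
# Rough sketch: pixel pitch ~ sensor width / horizontal resolution,
# and light gathered per pixel scales with pitch squared.
def pixel_pitch_um(sensor_width_mm: float, width_px: int) -> float:
    return sensor_width_mm * 1000 / width_px

p_6p = pixel_pitch_um(6.2, 4000)      # Nexus 6P (1/2.3"): ~1.55 um
p_pixel2 = pixel_pitch_um(5.6, 4000)  # Pixel 2 (1/2.6"):  ~1.40 um
print((p_6p / p_pixel2) ** 2)         # ~1.23: ~23% more light per pixel
```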

And of course for all the cameras that don’t advertise their sensor size, enterprising teardown artists do the work, and usually reveal that what we’re working with is teensy.

Wider apertures

On smartphone cameras as well as regular cameras, aperture controls the amount of light that gets to the image sensor. In a regular camera, aperture is manipulated to optimize for lighting conditions, blur, and desired depth of field, but in the world of smartphone cameras, in which optics are severely constrained, phone makers tend to optimize for having the widest aperture possible. This allows cameras to capture lots of light in all of those dark settings in which we all love to take photos, while keeping the shutter speed quick enough that your photo doesn’t come out blurry. (Super-wide apertures have their downsides, but we’ll set them aside for now.)

Aperture size is measured in f-stops, and the smaller the f-stop, the wider the aperture (or opening). Last year the LG V30 camera set a new high-water mark with an f/1.6 aperture, since surpassed by the dual-aperture tech on the Samsung Galaxy S9, which lets you switch between f/1.5 and f/2.4 apertures, depending on what you’re trying to achieve with your pictures. You can get a great close-up look at the mechanism in this JerryRigEverything video.
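
Because the light admitted scales with the area of the opening, seemingly small f-number changes add up. A quick sketch of the arithmetic, using the figures above (the f/2.0 comparison point is ours, just for scale):

```python
# Sketch: light admitted scales with aperture area, i.e. 1 / f_number**2.
def relative_light(f_number: float, reference_f: float) -> float:
    return (reference_f / f_number) ** 2

print(relative_light(1.5, 2.4))  # ~2.56x: Galaxy S9's wide vs. narrow mode
print(relative_light(1.6, 2.0))  # ~1.56x: LG V30's f/1.6 vs. a typical f/2.0
```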

Wider apertures have been made possible through the years as lens manufacturing quality has increased—something that’s of paramount importance if you’re letting more light in and want to keep a sharp, focused picture.

Better flash

Maybe not as important as some other components, but the on-board camera flash has made strides in the years that smartphones have been with us. Older phones, particularly Nokia and Sony models, made use of Xenon flash—very bright, but bulky and power-hungry too.

Today, phones use LED or dual-LED flash to produce a more subtle effect. In the case of dual-LED, two LEDs are used with slightly different color temperatures, theoretically producing an end result with a better balance of colors that isn’t completely unnatural. Look closely at the flash on the back of your phone and you may well see the two tiny bulbs.

The most recent iPhones include even more improvements, and show how various smartphone camera specs work together to produce better results than the previous generation. As well as introducing quad-LED flash in 2016, the 2017 models debuted a feature called Slow Sync: It keeps the shutter open longer to capture more light and reduce the light needed from the flash, which can flash less brightly for less time.

Faster focus

Maybe you’ve never thought much about the focus on your smartphone’s camera if you’re not shooting sports or wildlife, but it’s pretty significant in the overall quality of your shot. It works by moving the camera lens on tiny motors to make the object of your photo nice and clear, but a host of other hardware and software factors are at play—and down the years, phone autofocus has become much more accurate, and much faster.

Until a few years ago, phone cameras focused solely based on the contrast they could detect in a scene. Starting with 2014’s Galaxy S5 and iPhone 6, phase detection was added, built right into the sensor: It uses the information coming in from both sides of the lens to calculate where the perfect focus is (where the points of light should meet). It’s faster than the standard contrast-detection method, but it’s still not great in low light.
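
To see why contrast detection is the slower of the two, here’s a conceptual sketch (not any vendor’s actual algorithm; capture_at is a hypothetical camera hook): the camera has to sweep the lens and expose a frame at every candidate position, while phase detection works out the direction and distance to move from a single measurement.

```python
import numpy as np

# Conceptual sketch of contrast-detection autofocus.
def contrast_score(frame: np.ndarray) -> float:
    # Variance of pixel intensities is a crude sharpness measure:
    # in-focus frames have harder edges, hence higher variance.
    return float(frame.var())

def contrast_detect_af(capture_at, lens_positions):
    # The slow part: every candidate position costs a full exposure.
    return max(lens_positions, key=lambda pos: contrast_score(capture_at(pos)))
```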

Enter more smartphone camera tricks. The dual pixels used on the most recent Galaxy phones, for example, turn every pixel into a little phase detection system, improving performance in darker scenes. For its Pixel phones, Google went with a time-of-flight infrared laser to measure distances quickly in any lighting situation. Again, it shows manufacturers getting creative, and in different ways, to improve photos taken on mobile.

Optical image stabilization

Optical image stabilization is more important than you might think: It doesn’t just keep your shaky videos steady, it also means that when you’re taking a photo, the shutter can stay open for longer without any blur, and again that’s crucial in terms of collecting light. In other words, your phone camera isn’t only relying on image stabilization when it’s shooting sports.

On the most basic level, optical image stabilization uses a floating lens element and miniature electromagnetic motors to move it. As the technology has become more advanced, phones have become better able to incorporate other data (from the gyroscope, for example) to further factor out shakiness. In fact, there’s a whole host of different ways that manufacturers do this, both mechanical and non-mechanical.
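
As an idealized sketch of the problem OIS solves (real systems are closed-loop and considerably more involved), rotating the phone by an angle theta during the exposure drifts the image by roughly f * tan(theta), and that drift is what the floating lens has to cancel:

```python
import math

# Idealized sketch of the drift OIS must cancel; real systems are
# closed-loop and far more involved than this open-loop arithmetic.
def drift_um(focal_mm: float, shake_rad_s: float, exposure_s: float) -> float:
    theta = shake_rad_s * exposure_s          # total rotation, radians
    return focal_mm * 1000 * math.tan(theta)  # image drift, micrometers

# ~4 mm focal length, gentle 0.05 rad/s hand shake, 1/10 s exposure:
print(drift_um(4.0, 0.05, 0.1))  # ~20 um, i.e. many pixels' worth of blur
```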

OIS was actually cut from the original Pixel in favor of software adjustments, though it did appear in the Pixel 2. It’s also one of the small differences between the dual cameras on the iPhone 8 Plus and the iPhone X—the more expensive handset has OIS on both cameras, not just one. It’s a tech that has been refined, rather than revolutionized, in the time that smartphones have been around.

Dual cameras

What do you do when you can’t increase the size of your camera lens or your image sensor, because your components need to be as compact as possible? You add an extra camera to the mix. This is an approach now being adopted by phone makers across the board, with the LG G5 and the Huawei P9 the first to try it in the modern era. Two rear cameras had previously been seen on the HTC One (M8) and even before that, though they weren’t used in tandem as they are now.

The key benefit is clearly more data for the camera to work with, whether that’s more data on color or contrast or being able to make use of a lens with a wider angle. All the restrictions we’ve talked about above can be overcome to some extent if you add another sensor and lens set to the mix. Of course, as phones have become more powerful, they’ve also become better able to crunch the information coming in from two cameras simultaneously.

Use a telephoto lens for the secondary camera and you can suddenly get 2x optical zoom, as Apple did with the iPhone 7 Plus. Huawei phones, like the Mate 10 Pro, have a monochrome sensor behind the secondary camera, used to gather extra brightness and contrast information. Two cameras also make it easier to assess depth in a scene, because they have slightly differing perspectives—and that opens up the possibility of the blurred bokeh effect that’s available just about everywhere now.
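
The depth assessment follows from classic stereo geometry: the closer an object is, the more it shifts between the two slightly offset views. A minimal sketch with illustrative numbers (not any phone’s actual calibration):

```python
# Classic pinhole-stereo relation: depth Z = f * B / d, where f is the
# focal length in pixels, B the baseline between lenses, d the disparity.
def depth_mm(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    return focal_px * baseline_mm / disparity_px

# ~3000 px focal length, ~10 mm between the lenses, 30 px of disparity:
print(depth_mm(3000, 10, 30))  # 1000.0 -> subject roughly a meter away
```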

Improved processing

Finally, some of the biggest leaps forward in smartphone camera quality have come not through better optics, but through better software processing made possible by more powerful phones—as is the case with the Pixel 2 and the dedicated image-processing chip it has on board, which is now open to other apps.

One of the benefits you can see on a Pixel 2 phone is the way HDR effects can be calculated and applied in real-time as you frame your shot—if you’ve owned a smartphone for a while, you might remember the way HDR used to take a few seconds to process, and only then after you’d snapped the photo. Slowly but surely, processing power and algorithms are overtaking the physical limitations of the smartphone camera.
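
As a toy illustration of the merging idea (far simpler than Google’s actual HDR+ pipeline, which aligns and merges bursts of raw frames), a weighted blend of bracketed exposures favors well-exposed pixels over crushed shadows and blown highlights:

```python
import numpy as np

# Toy exposure-fusion sketch: blend bracketed shots with weights that
# peak at mid-gray, so under- and over-exposed pixels contribute least.
def merge_exposures(frames: list) -> np.ndarray:
    stack = np.stack(frames)                   # shape: (n, height, width)
    weights = 1.0 - 2.0 * np.abs(stack - 0.5)  # 1 at mid-gray, 0 at extremes
    weights += 1e-6                            # avoid division by zero
    return (stack * weights).sum(axis=0) / weights.sum(axis=0)

# Usage: merge_exposures([dark, mid, bright]) with arrays scaled to [0, 1].
```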

Another key area this affects is noise reduction, cleaning up the areas where the phone camera just can’t match a full-sized DSLR in terms of the light it can capture. Improved processing is also evident in something like Portrait Lighting, available on the latest iPhones: using software smarts and, in this case, the grunt of the A11 Bionic chip to mimic a professional lighting setup.

via Lifehacker https://ift.tt/2GBQbXe