All the Changes Coming to Your Instagram

At Facebook’s F8 conference on Tuesday, Mark Zuckerberg discussed how Instagram will be changing in the coming weeks. The upgrades include video chat, an Explore redesign, AR camera effects, and a bullying filter.

The new video chat feature will allow users to talk with one or more people. “You’re going to be able to just tap on a new camera icon in the top of any direct thread that you have and you’re going to be able to video chat one-on-one or with groups,” Zuckerberg explained. The chat screen can be minimized so users can use the feature while also scrolling through Instagram.

Instagram is also rolling out a new Explore design that features categories so users can choose different topic channels—like “animals,” “nail art,” and “slime”—rather than just seeing content that Instagram thinks they’ll like.

Instagram will start featuring the augmented reality effects platform that has been available on Facebook for about a year. Anyone will be able to build face filters and effects, and users can try out filters they see used by other accounts in their Stories feed.

But the brands will also be invading—third parties will be able to create custom filters. Zuckerberg announced that Instagram is debuting filter partnerships with Ariana Grande, BuzzFeed, Kylie Jenner, and the NBA, among others. Stories will also allow third-party integration with apps like GoPro and Spotify, so, for example, users will be able to share a song they’re listening to directly to their Stories feed from Spotify, instead of having to post a screenshot.

And to try to make Instagram a safer space, the company is launching a bullying filter. The new system will rely largely on machine learning to censor language perceived as harassment. Instagram says the new filter is on now for all users, but the setting can be turned off inside the app’s “Comment Controls” section.

via Gizmodo https://ift.tt/2FyzVRf

All the Ways Smartphone Cameras Have Improved Over the Years

Year after year, smartphone cameras have become more capable, more versatile, and more of a reason to leave your DSLR at home. So what are the tech innovations that have made the Pixel 2, the iPhone X, the Galaxy S9 and others such good photo takers compared to that old iPhone 6 or Galaxy S5?

Obviously, the technical aspects of photography and cameras can get very nuanced. But in broad strokes, here’s a look at the ways some key technologies have improved over the years to make your ’grams sharper and your snaps brighter.

More megapixels

At the core of the camera spec is the number of megapixels it captures—simply put, the resolution of the image that gets captured. It was by no means the first smartphone with a camera, but for comparison purposes the original 2007 iPhone came rocking a 2-megapixel rear camera with a fixed focus, capable of capturing images 1600 x 1200 pixels in size. Today’s Galaxy S9 and iPhone X have 12-megapixel cameras.
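
If you want to sanity-check those numbers, the math is simple: multiply the image’s pixel dimensions together. Here’s a quick sketch in Python (the 4032 x 3024 figure is a common 12-megapixel resolution used purely for illustration, not a value quoted from either phone’s spec sheet):

    # Megapixels are just the image's pixel dimensions multiplied together.
    def megapixels(width, height):
        return width * height / 1_000_000

    print(megapixels(1600, 1200))  # original iPhone: ~1.9 MP, marketed as 2 MP
    print(megapixels(4032, 3024))  # a typical 12 MP frame: ~12.2 MP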

In the early days of smartphone cameras, megapixels were the yardstick that these components were measured by: More megapixels meant a better camera, generally speaking. But that isn’t necessarily true now and it wasn’t necessarily true then, because there are a whole host of other factors that affect image quality, as you can see from the extensive list below.

The problem with cramming more pixels into the same-sized sensor is the pixels get smaller, and let in less light. Remember the HTC UltraPixels introduced in 2013? That was an attempt to reduce megapixels, increase pixel size, and therefore capture more light (and therefore detail) as the camera shutter flashed open for a split second. HTC was on to something, because today the megapixel race has all but ended, with smartphone makers making improvements elsewhere instead.

Bigger sensors

It is a truth universally acknowledged that the bigger the image sensor in a camera, the better the end result (essentially, it lets the camera capture more light and more color detail). With any camera, you’re relying on several components working well together, but the image sensor is a crucial one.

It’s a shame then that there’s not much room inside smartphones—mobile camera sensors tend to be between 1/2.3 and 1/3 inches, much smaller than those inside DSLRs and even quality point-and-shoot cameras, though manufacturers are often coy about the specs in this regard. In fact, sensor size hasn’t changed much over the years in smartphone photography, because of those physical limitations, and it’s usually been in other areas where improvements have been made.

You’ll have a hard time digging down into any phone’s specs to find the image sensor size for the camera advertised, but the Nexus 6P was an exception—its 1/2.3-inch sensor is on the larger end of the scale, particularly for 2015, though sensor size alone isn’t a spec where modern-day devices are all that much better than phones of yesteryear. Note too the 6P’s 1.55 μm (micrometer) pixel size, larger than the 1.4 μm pixels in the Pixel 2, with a 1/2.6-inch sensor.
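
To see why sensor size and pixel size are two sides of the same coin, you can roughly estimate pixel pitch by dividing the sensor’s width by the number of pixels across the image. Here’s a back-of-the-envelope sketch, assuming a 1/2.3-inch sensor is roughly 6.2 mm wide and a 12-megapixel frame is about 4,032 pixels across (ballpark figures, not official specs):

    # Rough pixel-pitch estimate: sensor width divided by horizontal pixel count.
    # The 6.2 mm width is an approximation for a 1/2.3-inch sensor, not a quoted spec.
    def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
        return sensor_width_mm * 1000 / horizontal_pixels

    print(pixel_pitch_um(6.2, 4032))  # ~1.54 um, close to the 6P's advertised 1.55 um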

And of course for all the cameras that don’t advertise their sensor size, enterprising teardown artists do the work, and usually reveal that what we’re working with is teensy.

Wider apertures

On smartphone cameras as well as regular cameras, aperture controls the amount of light that gets to the image sensor. In a regular camera, aperture is manipulated to optimize for lighting conditions, blur, and desired depth of field, but in the world of smartphone cameras, in which optics are severely constrained, phone makers tend to optimize for having the widest aperture possible. This allows cameras to capture lots of light in all of those dark settings in which we all love to take photos, while keeping the shutter speed quick enough that your photo doesn’t come out blurry. (Super-wide apertures have their downsides, but we’ll set them aside for now.)

Aperture size is measured in f-stops, and the smaller the f-stop, the wider the aperture (or opening). Last year the LG V30 camera set a new high watermark with an f/1.6 aperture, since surpassed by the dual aperture tech on the Samsung Galaxy S9, which lets you switch between f/1.5 and f/2.4 apertures, depending on what you’re trying to achieve with your pictures. You can get a great close-up look at the mechanism in this JerryRigEverything video.
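
As a rough rule of thumb, the light an aperture gathers scales with the inverse square of the f-number, which is why those small-sounding decimal changes matter. Here’s a quick illustration of what the S9’s dual aperture buys you (the rule of thumb ignores lens transmission losses and other real-world factors):

    # Rule of thumb: light gathered scales with the inverse square of the f-number.
    def light_ratio(f_wide, f_narrow):
        return (f_narrow / f_wide) ** 2

    print(light_ratio(1.5, 2.4))  # the S9's f/1.5 mode admits ~2.6x the light of its f/2.4 mode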

Wider apertures have been made possible through the years as lens manufacturing quality has increased—something that’s of paramount importance if you’re letting more light in and want to keep a sharp, focused picture.

Better flash

Maybe not as important as some other components, but the on-board camera flash has made strides in the years that smartphones have been with us. Older phones, particularly Nokia and Sony models, made use of Xenon flash—very bright, but bulky and power-hungry too.

Today, phones use LED or dual-LED flash to produce a more subtle effect. In the case of dual-LED, two LEDs are used with slightly different color temperatures, theoretically producing an end result with a better balance of colors that isn’t completely unnatural. Look closely at the flash on the back of your phone and you may well see the two tiny bulbs.

The most recent iPhones include even more improvements, and show how various smartphone camera specs work together to produce better results than the previous generation. As well as introducing quad-LED flash in 2016, the 2017 models debuted a feature called Slow Sync: It keeps the shutter open longer to capture more light and reduce the light needed from the flash, which can flash less brightly for less time.


Faster focus

Maybe you’ve never thought much about the focus on your smartphone’s camera if you’re not shooting sports or wildlife, but it’s pretty significant in the overall quality of your shot. It works by moving the camera lens on tiny motors to make the object of your photo nice and clear, but a host of other hardware and software factors are at play—and down the years, phone autofocus has become much more accurate, and much faster.

For years, phone cameras focused solely based on the contrast they could detect in a scene. Starting with the Galaxy S5 and iPhone 6, phase detection was added, built right into the sensor: It uses the information coming in from both sides of the lens to calculate where the perfect focus is (where the points of light should meet). It’s faster than the standard contrast detection method, but it’s still not great in low light.

Enter more smartphone camera tricks. The dual pixels used on the most recent Galaxy phones, for example, turn every pixel into a little phase detection system, improving performance in darker scenes. For its Pixel phones, Google went with a time-of-flight infrared laser to measure distances quickly in any lighting situation. Again, it shows manufacturers getting creative, and in different ways, to improve photos taken on mobile.
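
The principle behind that laser-assisted focus is simple enough to put in a few lines: time how long an infrared pulse takes to bounce back, and the distance is half of what light travels in that time. This is just the general idea, not a recreation of Google’s implementation, and the 10-nanosecond round trip below is a made-up example:

    # Time-of-flight ranging: distance is half the round trip at the speed of light.
    SPEED_OF_LIGHT = 299_792_458  # meters per second

    def distance_m(round_trip_seconds):
        return SPEED_OF_LIGHT * round_trip_seconds / 2

    print(distance_m(10e-9))  # a 10 ns round trip puts the subject about 1.5 m away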

Optical image stabilization

Optical image stabilization is more important than you might think: It doesn’t just keep your shaky videos steady, it also means that when you’re taking a photo, the shutter can stay open for longer without any blur, and again that’s crucial in terms of collecting light. In other words, your phone camera isn’t only relying on image stabilization when it’s shooting sports.

On the most basic level, optical image stabilization uses floating lens elements and miniature electromagnetic motors to move them. As the technology has become more advanced, phones have become better able to incorporate other data (from the gyroscope, for example) to further factor out shakiness. In fact there’s a whole host of different ways that manufacturers do this, both mechanical and non-mechanical.
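
To give a feel for the mechanical side, here’s a toy sketch of lens-shift stabilization: integrate the gyroscope’s angular velocity into a tilt angle, then nudge the lens to cancel the image motion that tilt would cause (for small angles, image shift is roughly focal length times angle). Real OIS controllers are far more sophisticated, and the sample numbers below are invented:

    # Toy lens-shift stabilization: integrate gyro readings into a tilt angle,
    # then shift the lens to counteract the image motion that tilt would cause.
    def lens_corrections_mm(gyro_rad_per_s, dt_s, focal_length_mm):
        angle_rad = 0.0
        corrections = []
        for omega in gyro_rad_per_s:
            angle_rad += omega * dt_s                         # integrate angular velocity
            corrections.append(-focal_length_mm * angle_rad)  # small-angle approximation
        return corrections

    # Simulated hand shake sampled at 1 kHz on a 4 mm lens (made-up numbers).
    print(lens_corrections_mm([0.02, 0.05, -0.01], dt_s=0.001, focal_length_mm=4.0))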

OIS was actually cut from the original Pixel in favor of software adjustments, though it did appear in the Pixel 2. It’s also one of the small differences between the dual cameras on the iPhone 8 Plus and the iPhone X—the more expensive handset has OIS on both cameras, not just one. It’s a tech that has been refined, rather than revolutionized, in the time that smartphones have been around.

Dual cameras

What do you do when you can’t increase the size of your camera lens or your image sensor, because your components need to be as compact as possible? You add an extra camera to the mix. This is an approach now being adopted by phone makers across the board, with the LG G5 and the Huawei P9 the first to try it in the modern era. Two rear cameras had previously been seen on the HTC M8 and even before that, though they weren’t used in tandem as they are now.

The key benefit is clearly more data for the camera to work with, whether that’s more data on color or contrast or being able to make use of a lens with a wider angle. All the restrictions we’ve talked about above can be overcome to some extent if you add another sensor and lens set to the mix. Of course, as phones have become more powerful, they’ve also become better able to crunch the information coming in from two cameras simultaneously.

Use a telephoto lens for the secondary camera and you can suddenly get 2x optical zoom, as Apple did with the iPhone 7 Plus. Huawei phones, like the Mate 10 Pro, have a monochrome sensor behind the secondary camera, used to gather extra brightness and contrast information. Two cameras also make it easier to assess depth in a scene, because they have slightly differing perspectives—and that opens up the possibility of the blurred bokeh effect that’s available just about everywhere now.
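
The depth trick comes from the classic stereo relation: a point shows up at slightly different horizontal positions in the two cameras (the disparity), and depth works out to focal length times baseline divided by that disparity. Here’s a minimal sketch with invented numbers, since real phones also lean heavily on software:

    # Stereo depth: depth = focal_length * baseline / disparity.
    def depth_mm(focal_length_px, baseline_mm, disparity_px):
        return focal_length_px * baseline_mm / disparity_px

    # Illustrative numbers only: a ~10 mm gap between lenses and a 30-pixel disparity.
    print(depth_mm(focal_length_px=3000, baseline_mm=10, disparity_px=30))  # ~1000 mm away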

Improved processing

Finally, some of the biggest leaps forward in smartphone camera quality have come not through better optics, but through better software processing made possible by more powerful phones. The Pixel 2 is a case in point, with a dedicated image-processing chip on board that Google has now opened up to other apps.

One of the benefits you can see on a Pixel 2 phone is the way HDR effects can be calculated and applied in real-time as you frame your shot—if you’ve owned a smartphone for a while, you might remember the way HDR used to take a few seconds to process, and only then after you’d snapped the photo. Slowly but surely, processing power and algorithms are overtaking the physical limitations of the smartphone camera.
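
For a flavor of what “HDR processing” actually involves, here’s a bare-bones exposure-fusion blend: each bracketed frame gets more weight wherever its pixels sit near mid-gray, so well-exposed regions dominate the merge. This is a teaching sketch in Python with NumPy, not the burst-merging pipeline Google’s HDR+ actually uses:

    import numpy as np

    # Bare-bones exposure fusion: weight each frame by how close its pixels are
    # to mid-gray, then take a weighted average across the bracketed frames.
    def fuse(frames):  # frames: list of float arrays with values in [0, 1]
        stack = np.stack(frames)
        weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))  # favor mid-tones
        return (weights * stack).sum(axis=0) / weights.sum(axis=0)

    dark = np.full((2, 2), 0.2)    # underexposed frame
    bright = np.full((2, 2), 0.7)  # brighter frame, closer to well-exposed
    print(fuse([dark, bright]))    # result leans toward the better-exposed frame (~0.53)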

Another key area this affects is noise reduction, cleaning up the areas where the phone camera just can’t match a full-sized DSLR in terms of the light it can capture. Improved processing is also evident in something like Portrait Lighting, now available on the 2017 iPhones, which uses software smarts and the grunt of the A11 Bionic chip to approximate a professional camera setup.

via Lifehacker https://ift.tt/2GBQbXe

How an extra lens can transform your iPhone’s photos

A few years ago, the hit Sundance film Tangerine was shot entirely on an iPhone 5S to surprisingly good results. And last year, Steven Soderbergh did the same thing while making Unsane, using an iPhone 7 Plus as his only camera.

While a lot more than camera gear goes into making those movies look the way they do, the fact that professional filmmakers are using cameras that many of us keep in our pockets is a pretty encouraging sign when it comes to what we should be able to produce.

I wanted to test out just what you can do to improve your phone’s photo- and video-taking abilities, so I tried out a few different things: some add-on lenses, a microphone, and a stabilizer. You can see the results in the video above.

One thing I was particularly interested in was add-on lenses, which fit on top of your phone’s existing lens to provide a different field of view and potentially even better image quality. The best known of these come from Moment, but Moment’s lenses are expensive — they’re around $90 to $100. So I also tried out some cheaper lenses from no-name companies on Amazon.

It turns out, you really do get what you pay for. Here are some of the differences:

Left: Ztylus Switch 6 fisheye lens / Right: Moment 2nd gen superfish lens (photos taken a few hours apart)

Macro shot of tree notch. Left: Amir clip-on lens / Right: Moment 2nd gen macro lens

Left: Ztylus Switch 6 tele / Right: Moment 2nd gen tele

So yeah, if you’re serious about getting more out of your phone’s camera, those photos should make it obvious that it’s worth paying more — in all cases, Moment’s lenses were brighter, sharper, and nicer to look at. That said, if you’re just curious about what add-on lenses can do and want to explore some fun effects, I don’t think you’d be too upset with the cheap lenses, so long as you don’t spend much more than $10. The three-lens kit that I got was only $12, and I probably got $12 of fun out of it. I just wouldn’t shoot anything too important with them.

via The Verge – Circuit Breaker stories for @circuitbreaker IFTTT twitter https://ift.tt/2Gs83Qp

Google’s AI-powered Google Lens rolls out on iOS

Google in a tweet on Thursday said its Google Lens visual search feature will roll out to iOS devices over the coming week as part of an update to the company’s Google Photos app.

Announced in a post to the official Google Photos Twitter account, Lens started rolling out to iOS users today in the app’s latest version 3.15. The release is available for some users now and will be accessible to all comers within a week, Google said.

Building on artificial intelligence and computer vision technology developed in part for Google Goggles, Lens is a visual search tool that integrates Google Assistant assets to recognize objects in live and previously captured images.

Shown off at Mobile World Congress earlier this month, Lens can distinguish flower types, parse text from business cards, pull up restaurant reviews, create calendar entries and more.

For example, pointing a smartphone camera at a historical monument will trigger Assistant to retrieve details on the site, which appear in an onscreen overlay. Alternatively, aiming Lens at a business card or photograph can trigger a procedure that allows users to create a new phonebook contact.

Subsequent Twitter posts today offer tips on how to best take advantage of Lens. Users can learn more about landmarks by tapping on the Lens icon, for example, or copy and take action on text seen in a photo. Text prompts can be used to navigate to websites, get directions, add events to a calendar, call phone numbers and more. Google also presents the option of pointing Lens at a book cover to find online reviews and synopses.

Initially available on Pixel devices, the feature made its way to Android devices through Google Assistant in the Google Photos app last week. At the time, Google said iOS support would be coming soon, but failed to detail a specific launch timeline.

Apple does not currently market visual search functionality in Siri, though the Photos app does boast image recognition features capable of distinguishing people, objects and locations. The iPhone maker also introduced image recognition capabilities into ARKit 1.5, allowing developers to build out features like interactive movie posters and book covers.

Google Photos is a free 212.1MB download from the iOS App Store.

via AppleInsider http://ift.tt/2HBJmkp