LG’s announcements at MWC 2018 have been, to put it bluntly, underwhelming. While a refresh of their main flagship line, the LG G series, is always present at LG’s press conferences, this year they opted to skip the announcement of the LG G7 and instead went with a refresh of the LG V30, dubbed the LG V30S ThinQ. This is likely related to reports that LG shelved the current development of the LG G7 and started over from scratch. But a leaked video showcasing the unreleased phone has now surfaced, so it’s possible they’re pretty far along in the process.
Israeli technology site ynet apparently got into a private LG event for some hands-on time with a demo unit of the unreleased device, which this time around seems to be heavily influenced by Apple’s latest smartphone, the iPhone X. If the video is accurate, the LG G7 will feature a notch at the top of the display, housing the earpiece and the front-facing camera. These “notched” displays seem to be the trend going forward among most major players as screen bezels keep getting smaller and smaller.
Going into the specifications, the LG G7 would feature a 6-inch POLED display with a 3120×1440 resolution, translating into a 19.5:9 aspect ratio—the same aspect ratio found in the latest Apple flagship. Internally we’re dealing with a typical 2018 flagship phone: it’s powered by a Snapdragon 845 processor and comes in configurations of 64 GB of storage with 4 GB of RAM or 128 GB of storage with 6 GB of RAM. Other specifications include a 3,000 mAh battery, a dual rear camera setup with a regular and a wide-angle lens, and quite possibly the same AI features found in the recently announced LG V30S ThinQ.
It’s not clear yet whether this is the previously scrapped model of the LG G7 or the new, rebuilt one that will possibly be announced, but given that it’s being shown at a private event, we’re leaning towards the latter. Of course, this could be an early prototype and doesn’t necessarily have to resemble the final LG G7, but only time will tell whether this particular phone makes it to store shelves.
Biometric authentication may not be as secure as PINs or passwords, but its convenience is a big selling point for many consumers. The extremely quick fingerprint scanner on the OnePlus flagships has been praised almost universally, but lately companies have been gravitating towards facial recognition technology as an alternative. For instance, there’s the OnePlus 5T and the Honor 7X with their respective takes on a Face Unlock feature. Samsung phones also offer facial recognition for unlocking, but the biometric authentication technology the company is most proud of is its iris scanner. Now, it appears that iris scanners may be coming to more Android phones in the future, as official support for them is being added to Android.
For those of us without a Samsung Galaxy flagship, there aren’t very many options when it comes to a smartphone with an iris scanner. In fact, there’s only a single option, and that phone isn’t even available for sale yet: an obscure smartphone called the BitVault, aimed at cryptocurrency enthusiasts.
This smartphone and an unannounced device from a Japanese OEM are the only non-Samsung Galaxy devices that I’m aware of that offer iris scanning. The chip that powers these phones’ iris scanners is the FPC ActiveIRIS by Fingerprints.
FPC ActiveIRIS. Iris Recognition for Smartphones. Source: FPC.
You may have never heard of this company, but you have most likely used a smartphone that incorporates its technology. Some of the smartphones that use fingerprint scanners from FPC include the Google Pixel, the Honor 8, and the Huawei Mate 9 Pro. Their fingerprint sensors are found on many other devices, including several from Xiaomi, so it’s safe to say that FPC is one of the leading vendors of the biometric authentication technology found in smartphones.
FPC Fingerprint Scanners on the Home Button, Rear, and Side of the Device. Source: FPC.
So why is this company important? Because several of its engineers have been working on incorporating native support for biometric iris scanners into Android. There are several commits involved, all of which should be looked at together to get a good picture of what’s going on.
The inclusion of a HAL interface will standardize how the Android framework communicates with iris scanners. This means that products from multiple vendors, not just from FPC, will be able to function on Android. Most importantly, it also opens up the ability for AOSP-based ROMs to work generically with iris-scanning hardware. For instance, the Project Treble GSIs rely on the standardized fingerprint HAL for basic fingerprint scanner functionality to work out of the box; without an equivalent iris HAL, the new Exynos Samsung Galaxy S9 and Galaxy S9+ would be unable to use their iris scanners on an AOSP ROM.
The SELinux policies for the iris scanners are wholly uninteresting for end users, but they’re there if you want to take a look at them. The inclusion of the base iris feature in Android will allow apps to detect whether a device has an iris scanner. Finally, the inclusion of the iris framework is what will actually allow third-party apps to use the iris scanner for authentication in the future. Here are the relevant strings:
Iris Scanner in Framework
<string name="permlab_manageIris">manage iris hardware</string>
<!-- Description of an application permission, listed so the user can choose whether they want to allow the application to do this. -->
<string name="permdesc_manageIris">Allows the app to invoke methods to add and delete iris templates for use.</string>
<!-- Title of an application permission, listed so the user can choose whether they want to allow the application to do this. -->
<string name="permlab_useIris">use iris hardware</string>
<!-- Description of an application permission, listed so the user can choose whether they want to allow the application to do this. -->
<string name="permdesc_useIris">Allows the app to use iris hardware for authentication</string>
<!-- Message shown during iris acquisision when the iris cannot be recognized -->
<string name="iris_acquired_insufficient">Couldn\'t process iris. Please try again.</string>
<!-- Message shown during iris acquisision when the iris image is too bright -->
<string name="iris_acquired_too_bright">Iris is too bright. Please try in low light.</string>
<!-- Message shown during iris acquisision when the iris image is too dark -->
<string name="iris_acquired_too_dark">Iris is too dark. Please uncover light source.</string>
<!-- Message shown during iris acquisision when the user is too close -->
<string name="iris_acquired_too_close">Move further.</string>
<!-- Message shown during iris acquisision when the user is too far -->
<string name="iris_acquired_too_far">Move closer.</string>
<!-- Message shown during iris acquisision when the user eyes closed-->
<string name="iris_acquired_eyes_closed">Open eyes.</string>
<!-- Message shown during iris acquisision when the user eyes partially obscured-->
<string name="iris_acquired_eyes_partially_obscured">Open eyes wider.</string>
<!-- Array containing custom messages shown during iris acquisision from vendor. Vendor is expected to add and translate these strings -->
<string-array name="iris_acquired_vendor">
</string-array>
<!-- Error message shown when the iris hardware can't be accessed -->
<string name="iris_error_hw_not_available">Iris hardware not available.</string>
<!-- Error message shown when the iris hardware has run out of room for storing iriss -->
<string name="iris_error_no_space">Iris can\'t be stored. Please remove an existing iris.</string>
<!-- Error message shown when the iris hardware timer has expired and the user needs to restart the operation. -->
<string name="iris_error_timeout">Iris time out reached. Try again.</string>
<!-- Generic error message shown when the iris operation (e.g. enrollment or authentication) is canceled. Generally not shown to the user-->
<string name="iris_error_canceled">Iris operation canceled.</string>
<!-- Generic error message shown when the iris operation fails because too many attempts have been made. -->
<string name="iris_error_lockout">Too many attempts. Try again later.</string>
<!-- Generic error message shown when the iris operation fails because strong authentication is required -->
<string name="iris_error_lockout_permanent">Too many attempts. Iris sensor disabled.</string>
<!-- Generic error message shown when the iris hardware can't recognize the iris -->
<string name="iris_error_unable_to_process">Try again.</string>
<!-- Template to be used to name enrolled irises by default. -->
<string name="iris_name_template">Iris <xliff:g id="irisId" example="1">%d</xliff:g></string>
<!-- Array containing custom error messages from vendor. Vendor is expected to add and translate these strings -->
<string-array name="iris_error_vendor">
</string-array>
<!-- Content description which should be used for the iris icon. -->
<string name="iris_icon_content_description">Iris icon</string>
In the framework’s manifest, the proposed permission “android.permission.USE_IRIS” has a protection level of “normal,” so third-party apps would indeed be able to request the permission; since it is a normal permission, it would be granted automatically at install time rather than requiring a runtime prompt.
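For app developers, the flow would presumably mirror the existing fingerprint APIs: declare the permission, check for the hardware, then authenticate. Since none of this has been merged yet, the sketch below is purely hypothetical; the “android.hardware.iris” feature string and the helper class are placeholders of our own, and only the USE_IRIS permission name comes from the commits themselves.

// Hypothetical sketch only: the iris framework described above has not been merged,
// so treat every name here as unconfirmed. It mirrors how apps already gate
// FingerprintManager usage: declare the permission, check for the hardware feature,
// and fall back gracefully if either is missing.
//
// AndroidManifest.xml would declare:
//   <uses-permission android:name="android.permission.USE_IRIS" />

import android.content.Context;
import android.content.pm.PackageManager;

public class IrisAvailability {
    // Assumed feature string; the real constant would only exist once (and if)
    // these commits land in a release.
    private static final String FEATURE_IRIS = "android.hardware.iris";

    public static boolean canUseIris(Context context) {
        PackageManager pm = context.getPackageManager();
        boolean hasHardware = pm.hasSystemFeature(FEATURE_IRIS);
        boolean hasPermission = pm.checkPermission(
                "android.permission.USE_IRIS", context.getPackageName())
                == PackageManager.PERMISSION_GRANTED;
        return hasHardware && hasPermission;
    }
}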
Lastly, another commit adds support for iris identification in the keyguard. This is what will actually allow the user to scan their iris to dismiss the lock screen. According to the commit, iris authentication only kicks in once the screen turns on, in order to reduce power consumption. Further, the iris scanner can be disabled via the Device Policy Manager if that authority (such as a workplace) deems the iris scanner an insecure method of authentication.
Something interesting going on in all of these commits is how, in many places, references to fingerprints in the Android framework are being genericized to refer to biometrics. This prepares Android for potentially additional methods of biometric authentication in the future, though it’s unclear what those may be.
I won’t bore you with the rest of the implementation details, so I’ll move on to the significance of these commits. What this means is that a future version of Android, likely Android P, will include native support for iris-scanning hardware. I say “likely” because the commits haven’t been merged yet—the changes are very lengthy and could take a few weeks or even months to pass code review.
It’s very likely that it’ll make it into Android P, however, and there are even hints of the iris scanner framework code having P-specific changes in place (such as doing away with storing user information in /data/system/users and instead relocating it to a new /data/vendor directory, likely due to as-yet-undisclosed Project Treble requirements).
Further, this does appear to be full support for iris scanners, though that doesn’t mean additional features won’t be added by vendors (in fact, the comments explicitly mention that). The basic implementation is there, so we should expect to see future smartphones shipping with biometric iris scanners. There is no evidence in these commits that the Google Pixel 3 will have such a feature, though, so don’t assume that any particular device will get an iris scanner because of these changes.
Note: I did reach out to FPC for comment on these changes, but did not receive a response from them by the time of this article’s publication.
from xda-developers http://ift.tt/2F5Dkri
via IFTTT
Huawei has decided to skip over Mobile World Congress this year to hold an event of their own, and as previously announced, the Chinese giant is preparing to launch their P20 line of devices on March 27, in Paris, France. Leaks for these devices have been making appearances despite Huawei’s absence at MWC, to the point where we already know almost everything about the 3-device lineup—the P20 Pro, with a triple rear camera mount, will be joined by the P20 and the smaller P20 Lite. We’ve seen the regular P20 already, and we’re now getting to know more about its smaller sibling, the Huawei P20 Lite.
Source: Evan Blass // VentureBeat
The P20 Lite is decidedly a mid-range device, at least specifications-wise: it will allegedly be powered by the octa-core HiSilicon Kirin 659 SoC with 4 GB of RAM and 64 GB of storage, the same formula that has already been tried and tested in the Honor 7X. On the outside, though, it looks much more premium than your average mid-range device. Huawei is going all in with the iPhone X/Essential Phone notch trend, for better or worse. The display is also being upgraded to accommodate the aforementioned notch. The 2250 x 1080 FHD+ panel is not quite 18:9, as the resolution actually translates to 18.75:9—closer to 19:9, and slightly taller than the Galaxy S9/S9+ panel.
Around back, we have a glass panel with a rear-mounted fingerprint scanner and a vertically mounted dual-lens setup, following the trend popularized by the iPhone X. The Leica co-developed camera setup with 16 MP sensors should provide more than decent camera performance, especially given the mid-range specifications. Other hardware specifications include a 3,520 mAh battery that, given the switch from a metal build to a glass back, could be aided by wireless charging.
All in all, the P20 Lite is shaping up to be pretty similar to its flagship siblings while cutting the right corners. No pricing information has been provided yet, but we should know more during the announcement event on March 27.
Update 2/28/18: Google has published a blog post today confirming the changes. More details at the end of the article.
While some Android enthusiasts are speculating about which dessert the next version of Android will be named after, there are some interesting developments going on behind the scenes. We’ve spotted a few noteworthy upcoming features in Android P, but a more recent discovery in the Android Open Source Project (AOSP) has proven far more interesting. According to these recent commits, applications may be restricted from accessing APIs that are undocumented in the Android SDK (such as APIs marked with the @hide javadoc annotation).
Why this matters
The Android Software Development Kit (SDK) provides developers with the API libraries and tools they need to build and test new Android applications. Each new release of Android brings a whole host of new APIs that are available to developers through the Android SDK. Which APIs are available to an app depends on the compileSdkVersion the developer sets. That’s why Google’s new Play Store requirements are so significant—they will force applications to update and migrate to newer APIs.
Google hosts documentation pages for each class and all of its methods that are available in each API level. This is the set of documented APIs that are available in the official Android SDK. You can browse the list of classes easily using an Android app such as the recently released Android SDK Search app by Android engineer Jake Wharton.
However, not all of the APIs available in each Android release are documented by Google or included in the official Android SDK. Many undocumented APIs are nonetheless very useful. It isn’t recommended that developers build their apps using undocumented, or hidden, APIs, but many do so because there’s simply no alternative if they want to offer a certain feature. Developers that use hidden or undocumented APIs can also gain a competitive advantage, since they can offer features that their competitors—who stick to the APIs offered by the Android SDK—cannot.
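To make that concrete, here is a minimal sketch of the reflection pattern such apps commonly rely on. android.os.SystemProperties is a well-known @hide class that isn’t part of the SDK; which hidden APIs any particular app actually touches will of course vary.

import java.lang.reflect.Method;

public class HiddenApiExample {
    // Reads a system property via the hidden android.os.SystemProperties class.
    // Because the class isn't in the SDK, it can't be referenced directly at
    // compile time, so apps reach it through reflection at runtime.
    public static String getSystemProperty(String key, String def) {
        try {
            Class<?> clazz = Class.forName("android.os.SystemProperties");
            Method get = clazz.getMethod("get", String.class, String.class);
            return (String) get.invoke(null, key, def);
        } catch (ReflectiveOperationException e) {
            // If the method is ever blacklisted or removed, fall back gracefully.
            return def;
        }
    }
}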
While I cannot provide a list of apps that utilize undocumented APIs (developers probably don’t share which ones they use because it would give their competitors a leg up), the list is probably rather large. Thus, I would conclude that banning access to hidden APIs would be significant. Mark Murphy, founder of Commonsware, agrees:
I agree with the assessment that bulk-banning access to @hide-annotated items will be a big deal, if that comes to pass. Hopefully, few apps access these items as part of key functionality. However, I suspect that lots of name-brand apps use them on occasion, directly or through a library.
What is happening in Android P?
These upcoming changes were first noted by XDA Senior Recognized Developer rovo89, the developer of the Xposed Framework. He pointed out two commits to me. The first, which has already been merged, introduces a new build tool called ‘hiddenapi.’ This tool modifies the access flags of class members within a DEX file whose signatures appear on an input greylist or blacklist; the marked methods are then treated as internal APIs with restricted access. The other commit describes how the API blacklist works: it prevents access to boot class methods and fields marked by the aforementioned ‘hiddenapi’ tool, which developers might otherwise access via static linking, reflection, or JNI.
According to rovo89, the end result of these two changes in Android P is the following:
If these commits get merged, it would mean that apps can no longer use/access hidden APIs, that is classes, methods and fields which are annotated with @hide in AOSP and therefore not part of the official SDK. This wouldn’t be a problem for Xposed modules as I could easily revert those commits or allow modules to also access these APIs. But there are many apps which take advantage of hidden APIs, and those would fail in the future.
Indeed, further commits show that this may be what Google is planning. One commit was never merged, as it was abandoned in favor of three smaller commits, but its commit message describes the purpose of these changes. Another set of commits shows that Google will suggest alternatives to developers who seek to use non-public APIs.
However, there are often no alternatives to certain hidden APIs. We at XDA can speak from experience here: unfortunately, this change may spell the end of some innovative apps, or it may require some big-name apps to reduce their functionality. This upcoming change seems similar in spirit to the recent crackdown on Accessibility Services (which was thankfully paused while Google evaluated innovative uses). While most apps that utilize undocumented APIs do so for benign reasons, there may be some apps that have misused them for nefarious purposes.
Because of this, Google may be locking down access to all hidden APIs in Android P in order to safeguard users from the few that abuse them. It’s hard to say just how much of an impact this may have on users, but if you are a developer considering looking through AOSP to find an innovative use of a hidden API, then you may want to reconsider.
Update: Google Confirms
In a blog post published today, February 28th, Google has confirmed these changes. Citing the risk of crashes for users and of forcing developers to roll out emergency fixes, Google states that the company has been gradually shifting towards discouraging developers from accessing non-SDK interfaces. Starting with Android P, the restrictions will expand to cover non-SDK Java language interfaces.
The company states that “some non-SDK methods and fields will be restricted,” though they did not elaborate on which ones would be restricted. Initially the restriction will focus on interfaces that are rarely used, and for a while the company will allow developers to continue to use non-SDK methods and fields where transitioning to an SDK method is technically challenging. However, eventually the restrictions will broaden, so developers of apps using non-SDK methods should transition as soon as possible in preparation for Android P. As for methods without an SDK alternative, Google is requesting developers to post on their bug tracker with more information.
The next developer preview, ostensibly arriving soon, will allow developers to test existing apps against the blacklist or greylist before the final release.
from xda-developers http://ift.tt/2DGETw6
via IFTTT
If there were any doubts that Samsung had lost its touch, the South Korean company quelled them at Mobile World Congress 2018… mostly.
On Sunday, the smartphone maker formally announced the Galaxy S9 and S9+, the newest phones in its storied Galaxy series. Both have lightning-fast processors in Samsung’s Exynos 9810 or Qualcomm’s Snapdragon 845 (depending on the model), industry-first variable f/1.5 + f/2.4 aperture rear cameras, professionally tuned stereo speakers, and new software features such as AR Emoji, Samsung’s take on Apple’s Animoji.
But while the Galaxy S9 and S9+ check every box imaginable, they lack the element of surprise.
There’s no beating around the bush: Samsung’s new standard-bearers are less revolutionary than evolutionary. They’re nearly identical to their predecessors in terms of design, right down to the curved edge-to-edge 18:9 displays, glossy back plates, and Gorilla Glass-shielded exteriors. And with the exception of new processors and speakers, not much has changed on the inside.
Passing judgment from afar isn’t exactly fair, though, so when Samsung extended XDA an invitation to try out the Galaxy S9 and S9+ for ourselves at its New York City venue, we eagerly accepted. Our impressions after an hour with both phones? Positive. Still, we can’t help but feel that while the Galaxy S9 and S9+ are faster, brighter, and louder than every other Galaxy smartphone that’s come before them, Samsung played it safe.
Design
The Galaxy S9 is distinguished from the Galaxy S9+ only by its screen size (it has a 5.8-inch display compared to the Galaxy S9+’s 6.2-inch display), dimensions (it measures 147.7 millimeters in length and 68.7 millimeters in width; the Galaxy S9+ is 158.0 millimeters long and 73.8 millimeters wide), and weight (it’s 26 grams lighter than the Galaxy S9+). It also lacks the Galaxy S9+’s secondary camera, and settles for 4GB of RAM instead of the S9+’s 6GB. Otherwise, the two phones are pretty much identical.
That’s especially true when you’re squinting at the two from a distance. It’s only when you hold them side-by-side that the differences become more apparent, albeit only slightly.
What was more striking to us, though, was just how similar the Galaxy S9 and S9+ look and feel to the Galaxy S8 and S8+. The pair isn’t a perfect analog for its outgoing forerunners, but most folks will have a tough time making out changes such as the ever-so-slightly slimmer top and bottom bezels and the subtler curve to the left and right of the screen.
Galaxy S8/S8+ fingerprint orientation on the left; Galaxy S9/S9+ fingerprint orientation on the right.
One thing they might notice is the fingerprint sensor, which was adjacent to the rear camera on the Galaxy S8 and S8+. On the S9 and S9+, it has been moved beneath the camera module (which is now oriented vertically as opposed to horizontally), a welcome improvement. Swiping a fingertip across the sensor used to require shimmying your hand up the side of the phone to reach a finger around the volume rocker (or the power button, if you’re left-handed); now that the sensor sits directly below the camera on the rear panel, it’s a far less arduous task. No finger kinesthetics required.
Fingerprints, while we’re on the subject, are something of a given on the Samsung Galaxy S9 and S9+’s Gorilla Glass 5 front and back. The scanner’s tweaked placement might prevent wayward digits from smudging the phones’ camera lenses, but does little to shield the highly reflective cover from sweaty, oily skin. As with the Galaxy S8 and Galaxy S8+, you’re going to want to throw the S9/S9+ in a protective case or carry around a microfiber cloth to keep it spick and span.
Camera
The Galaxy S9 and S9+’s design may not be radically different from the Galaxy S8 and S8+’s design, but the cameras are where the phones really shine. In fact, they’re easily the highlight.
There was initially some confusion about whether the Galaxy S9 and S9+ are capable of 4K HDR video recording. It’s a feature of the Snapdragon 845’s imaging chip, and a Qualcomm press release on Monday, since edited, contained language suggesting Samsung’s flagships would be one of the first on the market to support it. Unfortunately, that’s not the case: A Samsung representative confirmed to XDA that there are no plans to support 4K HDR video recording on either the Galaxy S9 or S9+. That puts the phones at a disadvantage compared to Sony’s newly announced Xperia XZ2, which has the same chipset and does support 4K HDR recording.
The Galaxy S9 has an 8MP f/1.7 autofocusing front-facing camera (1/3.6″ sensor size, 1.22µm pixel size, and 80-degree field of view) and a 12MP rear camera (1/2.55″ sensor size, 1.4µm pixel size, and 77-degree field of view), with the S9+ packing an additional 12MP telephoto lens (1/3.4″ sensor size, 1.0µm pixel size, 45-degree field of view) for “2x zoom”. The sensors have Super Speed Dual Pixel, a faster and more accurate version of Samsung’s Dual Pixel focusing technology, but they otherwise haven’t changed — they retain the Galaxy S8 and S8+’s optical image stabilization, LED flash, and phase-detection autofocus.
But the aperture is a smartphone first: it’s mechanical. The Pro mode in the S9 and S9+’s camera app gives you two settings to choose from: f/1.5, a wider aperture (lower f-number) better suited to low-light conditions (think nighttime and dimly lit offices), and f/2.4, the default setting. (Alternatively, the app’s Automatic mode switches to the f/1.5 aperture when ambient lighting dips below 100 lux.) A tiny motor in the Galaxy S9/S9+’s camera module is responsible for the adjustment — it contracts (when set to f/2.4) or expands (when set to f/1.5) a ring around the sensor’s lens.
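For what it’s worth, the plumbing for variable apertures already exists in Android’s Camera2 API, which lets a camera report its supported f-numbers and lets capture requests select one. Whether Samsung exposes both apertures to third-party apps this way is something we couldn’t verify at the demo, so treat the snippet below as a general sketch rather than confirmed Galaxy S9 behavior.

import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;

public class ApertureQuery {
    // Lists the f-numbers a camera reports; a dual-aperture module would ideally
    // report something like [1.5, 2.4]. Whether the S9 reports both values to
    // third-party apps is unconfirmed.
    public static float[] availableApertures(CameraManager manager, String cameraId)
            throws Exception {
        CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
        return chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_APERTURES);
    }

    // Within a capture session, an app would then request one of the reported values.
    public static void requestAperture(CaptureRequest.Builder builder, float fNumber) {
        builder.set(CaptureRequest.LENS_APERTURE, fNumber);
    }
}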
The switch between the two apertures is nearly instantaneous — a major plus. And when we compared results from the two aperture settings at the same ISO and shutter speed, the photos captured in f/1.5 aperture seemed a little bit brighter and crisper than their f/2.4 counterparts.
We tested the Galaxy S9+’s camera in the camera app’s Pro mode with the focus, shutter speed, and white balance set to “auto” and the exposure set to “0.0”. We took four photos in two different locations around Samsung’s demo venue: in each spot, one with the aperture set to f/2.4 and a second with the aperture set to f/1.5. Here are the results:
The Galaxy S9 and S9+’s other camera improvements take advantage of the image signal processors (ISPs) in the Exynos 9810 and Snapdragon 845 (the Spectra 280) and dedicated DRAM. Snapping a photo on either phone triggers a burst shot of 12 images, which the ISPs divide into three sets of four, combine on a per-set basis, and merge into a single picture. Samsung calls it multiframe noise reduction; previous-generation Galaxy smartphones combined just three images.
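Samsung hasn’t published the details of its pipeline, but the principle behind multiframe noise reduction is simple: sensor noise is random from frame to frame, so averaging aligned frames cancels much of it while real scene detail survives. The following is a deliberately naive sketch of the “three sets of four” grouping described above, not a reproduction of Samsung’s ISP processing.

// Naive illustration of multiframe noise reduction: average each set of four
// frames, then merge the three per-set results. A real ISP also aligns frames
// and weights them to handle motion, none of which is modeled here.
public class MultiFrameSketch {
    // frames: 12 grayscale frames, each frames[i][pixel] in the range 0..255
    public static int[] reduceNoise(int[][] frames) {
        int pixels = frames[0].length;
        int[][] setAverages = new int[3][pixels];
        for (int set = 0; set < 3; set++) {
            for (int p = 0; p < pixels; p++) {
                int sum = 0;
                for (int i = 0; i < 4; i++) {
                    sum += frames[set * 4 + i][p];
                }
                setAverages[set][p] = sum / 4;
            }
        }
        // Merge the three per-set averages into the final image.
        int[] result = new int[pixels];
        for (int p = 0; p < pixels; p++) {
            result[p] = (setAverages[0][p] + setAverages[1][p] + setAverages[2][p]) / 3;
        }
        return result;
    }
}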
The resulting composite is much more vibrant, crisp, and clear than a one-shot picture. (That won’t come as a surprise to anyone who’s used the Google Camera’s HDR+ mode, which takes a similar approach.) Samsung says the Galaxy S9 and S9+’s improvements translate to 30 percent less noise in low-light conditions — a claim we’ll have to put to the test at a later date. The photos we took with the Galaxy S9 and S9+ seemed sharp and colorful to our untrained eyes.
Samsung gave the selfie sensor some love, too. On the Galaxy S9 and S9+, the 8MP front camera can optionally blur the background of images while keeping the foreground in focus in Selfie focus mode, much like the bokeh effect on the Google Pixel 2 and Pixel 2 XL. It’s accomplished entirely in software, and the results aren’t perfect — in several of our test selfies, the outer edges of the subject’s face are a bit smudged where the algorithm blended the image.
On the video side of things, the Galaxy S9 and S9+ have a new trick up their sleeves: 960FPS recording. Taking a page from the Sony Xperia XZ Premium‘s playbook, the handsets can capture clips in what Samsung calls Super Slow Motion. Unlike Sony’s Xperia XZ2 and XZ2 Compact, which can record at 1080p resolution, they’re capped at 720p (the clips are captured in 0.2-second bursts and play back as six-second videos). But we have no complaints about the quality: The few clips we captured were razor sharp and buttery smooth. We especially liked the automatic capture feature, which triggers Super Slow Motion when an object enters an adjustable, predefined boundary in the camera’s viewfinder.
Another nifty tool is a GIF generator that turns Super Slow Motion videos into shareable images, with effects such as an Instagram-esque Loop, Swing, and Reverse. (You can save the resulting image as your wallpaper, if you so choose.) It’s sure to come in handy when your social medium of choice doesn’t support video.
Display
If you’re like most people, you’ll spend a majority of time staring at the Galaxy S9 and S9+’s screen — not their back covers. Both phones have 2960×1440 Quad HD+ Super AMOLED displays with 18.5:9 aspect ratios (570 pixels per inch on the Galaxy S9; 529 ppi on the Galaxy S9+), and Samsung says they’re the “brightest ever” on a Galaxy series smartphone (they both reach 700 nits, or 15% higher than the Galaxy S8 series’ maximum).
That may be so, but the overhead lighting in Samsung’s demo space made it tough to judge the difference with the naked eye. Unfortunately, we didn’t have a brightness tester and were instructed not to take the phones outside, where direct sunlight might have made it easier to judge the improvements (and/or trigger high brightness mode). Suffice it to say that the Galaxy S9 and S9+’s panels are just as colorful and vibrant as they are on the Galaxy S8 and S8+, if not more so.
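Incidentally, the pixel-density figures quoted above are easy to sanity-check: pixels per inch is just the diagonal resolution divided by the diagonal screen size, which lands within a couple of ppi of Samsung’s numbers (the small discrepancy presumably comes from how the curved, rounded-corner panels are measured).

public class PixelDensity {
    // ppi = diagonal pixel count / diagonal size in inches
    static double ppi(int widthPx, int heightPx, double diagonalInches) {
        return Math.hypot(widthPx, heightPx) / diagonalInches;
    }

    public static void main(String[] args) {
        System.out.println(ppi(2960, 1440, 5.8)); // ~567.5, vs. Samsung's quoted 570
        System.out.println(ppi(2960, 1440, 6.2)); // ~530.9, vs. Samsung's quoted 529
    }
}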
If the default, slightly oversaturated color palette isn’t to your liking, there are four screen modes to choose from:
Adaptive Display, the default option
AMOLED Cinema, which uses DCI-P3, the standard wide color space common in 4K HDR TVs
AMOLED Photo, which uses the Adobe RGB color gamut
Basic Screen Mode, which uses the sRGB/Rec. 709 color space.
Each has its advantages and disadvantages, with the AMOLED Cinema and Basic modes producing flatter but ostensibly more accurate colors than the two alternatives. It’s ultimately a matter of personal preference.
It’s worth mentioning that the Galaxy S9 and S9+ are certified by the UHD Alliance for Mobile HDR Premium content (thanks in part to support for DCI-P3). The nuances of HDR are a little complicated, but in essence, HDR videos and video games boast higher contrast and brightness than non-HDR media, contributing to a picture with more accurate colors overall.
It’s not just HDR content that benefits — according to a Samsung representative, the S9 and S9+ have Samsung’s Video Enhancer feature, a carryover from the S7 and S8 that boosts the brightness and color contrast of streaming and local video.
Samsung’s words rang true in our limited time with the Galaxy S9 and S9+. The HDR YouTube videos we watched were richly rendered on the phones’ screens, with the AMOLED screens’ deep blacks highlighting the bright reds, yellows, and greens.
Iris scanner
The Galaxy S8 and S8+ shipped with an iris scanner. It worked, but somewhat inconsistently in certain lighting conditions — especially if you wore color contacts or sunglasses, or held your phone beyond the recommended distance from your eyes. The iris scanner is present and accounted for in the Galaxy S9 and S9+, but with a fallback this time: facial identification.
A new feature called Intelligent Scan uses both the iris scanner and the front-facing camera to secure the phones. In practice, when you tap the power button, both sensors start scanning your face for matches. As soon as there’s a positive ID, it’s open sesame — you’re greeted with the home screen.
Audio
A great screen is nothing without great speakers to match, and the Galaxy S9 and S9+ are Samsung’s strongest showing yet in that regard. The down-firing, AKG Acoustics-tuned stereo speakers easily clear the low bar set by the Galaxy S8 and S8+. They’re noticeably louder (40 percent louder, Samsung says), and they’re capable of delivering a “simulated surround sound experience” thanks to Dolby’s Atmos 3D technology. (Samsung’s venue wasn’t particularly conducive to testing this.)
A dearth of supported content makes Dolby Atmos less of a value-add than it otherwise might be, but a Samsung spokesperson said that Atmos-supported videos and movies will come to Netflix on smartphones later this year. Mum’s the word on specific titles and timing.
Don’t expect Galaxy S9 and S9+’s speakers to blow you away, though. They might sound better than last year’s models, but they’re still too tinny and boomy to stand in for a decent boombox or Bluetooth speaker.
AR Emoji
Apple’s Animoji, which tap the iPhone X’s depth-sensing Face ID camera for goofy animated iMessages, have achieved something of a cult following. It’s enough to have caught Samsung’s (and Asus’s) attention: The Galaxy S9 and S9+ ship with AR Emoji, a face-mapped camera feature that uses the phones’ front-facing sensor to mimic your mouth, eyebrow, and head movements on a humanoid caricature.
They’re easy to get up and running: On-screen instructions have you stare head-on at the camera and select your gender, and the camera app does the rest, analyzing more than 100 points on your face to render a cartoon version of you — replete with hair, eyebrows, customizable clothing, and a disproportionately small body.
A mini-me isn’t the only AR Emoji on offer. Samsung partnered with Disney to bring 3D-rendered versions of Mickey Mouse, Minnie Mouse, and characters from Pixar’s The Incredibles to the phones.
Whichever model you choose, the camera app automatically generates 18 animated AR Emoji stickers in a shareable format (MP4). There’s a host of additional masks, filters, and accessories to choose from. And unlike Apple’s Animoji, which can’t be exported from iMessage, AR Emoji work in any app — be it a messaging service like WhatsApp, a social network like Facebook, or a plain old email.
AR Emoji crashed and burned during Samsung’s press event in Barcelona on Sunday, and they were a little stiff in our experience, too. The single front camera struggles to track head movements and mouth movements beyond a fairly narrow field of view, and if you don’t hold the S9/S9+ close to your face when you’re creating an AR Emoji, the resulting animation can be really janky.
Suffice it to say, AR Emoji aren’t quite as endearing as the 1-to-1-tracked, cute and cuddly characters on the iPhone X.
Bixby improvements
Digital lipstick, courtesy of Bixby Vision.
Bixby, Samsung’s homegrown digital assistant, makes a return on the Galaxy S9 and S9+. The latest incarnation can be launched via the Galaxy S9 and S9+’s dedicated Bixby button (below the volume rocker on the left-hand side): a single press pulls up Bixby Home, a collection of cards that contain timely information. You’ll see the weather report, a preview of your commute (based on your location and proximity to your saved work/home address), upcoming alarms, and health information (like your step count) from S Health.
None of that’s new, but Bixby Vision, Bixby’s machine vision feature, is improved in a few key ways. An augmented reality feature overlays shades of lipstick, eyeshadow, and other makeup on your face, letting you “try on” beauty products before you purchase them through Sephora and Cover Girl. Bixby Vision now supports real-time translation a la Google Translate. And if you point Bixby Vision’s viewfinder at food, it’ll serve up the estimated calorie count and other nutritional data.
The “digital makeup” feature worked well in our testing (maybe too well), but we didn’t have an opportunity to try Bixby’s new food recognition or real-time translation features.
It’s worth noting that after the Galaxy S9 and S9+ ship in March, Bixby will gain additional features. In August, Bixby 2.0, which launched in public beta in December, will roll out to phones, Samsung mobile chief DJ Koh told members of the press at MWC 2018. It’ll recognize multiple voices and integrate tightly with TVs, refrigerators, and other connected home appliances.
Performance
The Galaxy S9 and S9+, like other recent flagship Samsung phones before them, ship with one of two systems-on-chip (SoCs). This time around, it’s Samsung’s Exynos 9810 or Qualcomm’s Snapdragon 845.
It’s worth diving into the technical weeds to get a better sense of the chips’ differences.
The Exynos 9810, the second SoC in the Exynos 9 series, is built on a 10nm FinFET process and adopts ARM’s DynamIQ architecture. It has four high-performance custom cores clocked up to 2.7GHz and four ARM Cortex-A55 cores clocked at 1.7GHz, and a wider pipeline with improved cache memory. Performance is substantially improved over the Exynos 8895 in the Galaxy S8 and S8+: Samsung says the Exynos 9810 is two times faster in terms of single-core performance and 40 percent faster in terms of multi-core performance.
The Exynos 9810 ships with the Mali-G72MP18 GPU, which has a slightly decreased core count compared to the Exynos 8895’s Mali-G71MP20, but improved per-core efficiency.
The chip’s Cat. 18 gigabit modem supports LTE download speeds of up to 1.2Gbps thanks to 6x carrier aggregation (6CA), 4×4 MIMO, 256-QAM, and enhanced Licensed-Assisted Access (eLAA), and it has neural network deep learning technologies that power Bixby’s image recognition features and AR Emoji’s face-tracking filters. Finally, there’s a secure element that safeguards biometric data such as fingerprints, iris scans, and facial information.
The Qualcomm Snapdragon 845, which we recently benchmarked, is also built on a 10nm process and adopts ARM DynamIQ. It has eight custom Kryo cores: four Cortex-A75 “Gold” performance cores clocked up to 2.8GHz and four Cortex-A55 “Silver” efficiency cores clocked at 1.7GHz, which contribute to a 30 percent boost in overall performance and a 25 to 30 percent improvement in power-efficiency compared to the Snapdragon 835 in the Galaxy S8 and S8+.
On the visual processing side of things, the Snapdragon 845 packs the Adreno 630, Qualcomm’s latest GPU. It’s 30 percent faster and 30 percent more power-efficient than the Snapdragon 835’s Adreno 540, and it has 2.5 times the display throughput.
Diagram of the Snapdragon 845’s Hexagon DSP.
The Snapdragon 845’s other notable peripherals include the X20 modem, which supports Cat. 18 LTE download speeds of up to 1.2Gbps, carrier aggregation, 4×4 MIMO, 256-QAM, and eLAA; the Hexagon DSP, a chip custom-designed for neural network workloads; and Qualcomm’s Secure Processing Unit, a secure element for biometric data.
Our Galaxy S9 and S9+ demo units had the Exynos 9810, and they felt as swift and speedy as you’d expect. Switching between apps and juggling multiple tabs in Chrome was just as breezy, likely thanks to the 6GB of RAM in the Galaxy S9+ and 4GB of RAM in the S9.
That said, we’re reluctant to jump to any conclusions about performance without time to run the phones through their paces (i.e., perform benchmarking tests and our in-house suite of scripts). Already, preliminary results have shown that the Exynos 9810 performs unpredictably in the Galaxy S9+, and in the interest of fairness, we’re reserving judgment until we’ve had a chance to thoroughly investigate Samsung’s claims.
We’ve also yet to test the Galaxy S9 and S9+’s battery life. They have the same capacities as the Galaxy S8 and S8+, respectively: 3,000mAh and 3,500mAh. (Both support wireless charging and Samsung’s Adaptive Fast Charging.) Samsung says the Galaxy S9 gets up to 14 hours of internet use on Wi-Fi, 11 hours on 3G, and 12 hours on 4G; 16 hours of video playback; and 22 hours of talk time. It says the Galaxy S9+ gets up to 15 hours on Wi-Fi, 13 hours on 3G, and 15 hours on 4G; 18 hours of video playback; and 25 hours of talk time.
And we haven’t tested the storage’s read and write speeds. The Galaxy S9 and S9+ ship with 64GB of internal storage (configurations of up to 256GB are available) and a microSD slot that supports cards up to 400GB.
Software
The Galaxy S9 and S9+ ship with Samsung Experience 9.0 atop Android Oreo. Both are Project Treble compatible, which is great news for the modding community — in the future, we expect to see the Galaxy S9 and S9+ boot generic Android Open Source Project images (though only the Exynos models, which have unlockable bootloaders).
As far as Samsung Experience 9.0 is concerned, there isn’t much in the way of surprises. It began to roll out late last year as part of the Android Oreo beta to Galaxy S8 and S8+ participants in Samsung’s Beta Program, after which it launched more broadly in stable form. From what we can tell, Samsung Experience 9.0 on the S9 and S9+ is no different from the publicly available version, save for features like AR Emoji.
The new and improved Samsung Keyboard adds a Google-style toolbar to the top row with shortcuts, a theme switcher, and a GIF creator. And Edge Lighting, a staple of Samsung’s curved-screen devices that shows alerts, text scrolls, and other peripheral information on the phone’s sides, has been enhanced with more lighting effects.
The Samsung Experience 9.0 launcher implements support for Android Oreo’s Notification Dots and Adaptive Icons, and a new color picker that lets you tweak the appearance of folders. Additionally, the lock screen has a new clock widget and an adaptive coloring option that changes the lock screen color to match your phone’s background.
If you nab the new $150 DeX Dock with your Galaxy S9/S9+, you’ll benefit from the new, higher-resolution (2,560 x 1,440) display output, up from the previous DeX Dock’s 1080p. Samsung says that more than 40 partners are optimizing their Android apps for the DeX Dock interface, but alternatively, you can take advantage of Samsung’s Linux on Galaxy feature and install a full-blown Linux distribution.
Conclusion
If it wasn’t obvious from the get-go, Samsung isn’t out to break new ground or shake up the smartphone industry with the Galaxy S9 and S9+. That much became clear in the hour we spent getting a handle on AR Emoji, putting the variable aperture to the test, and blasting sound through the stereo speakers. The S9 and S9+ are iterative in every sense of the word: the new processors are on a par with other flagship devices announced for this year; the S9+’s upgraded RAM and secondary rear camera bring it up to speed with the competition; and the down-firing speakers merely improve on the S8 and S8+’s disappointing sound.
But iteration isn’t necessarily a bad thing. In fact, a Samsung rep told me that the company is well aware that most soon-to-be Galaxy S9 and S9+ owners will be upgrading from a Galaxy S7 or S7 edge. For them, the phones are a giant technological leap forward.
For current S8 and S8+ owners, though, or folks with a relatively new flagship such as the OnePlus 5T or LG V30, the incremental differences make the price tags hard to justify. At $720 and $840 for the S9 and S9+, respectively, they’re easily two of the most expensive phones on the market. Trade-in deals and monthly installment pricing help ease the burden a little, but no matter how you slice it, that’s a lot of moolah for variable aperture.
from xda-developers http://ift.tt/2HQSd2E
via IFTTT
DeX is a hardware accessory first sold by Samsung alongside the Samsung Galaxy S8 and Galaxy S8+. It consists of a docking station that lets the user connect the Galaxy S8, Galaxy S8+, or Galaxy Note 8 to a monitor, keyboard, and mouse to get access to a full desktop UI. The dock has an HDMI port, an Ethernet port, and two USB ports. DeX is essentially an extension of Android Nougat’s multi-window feature that pushes optimized applications onto the connected display. There are obvious limitations to this, as a lot of software needs to be updated to support Samsung’s DeX, but that’s changing.
Now, Samsung is launching its Linux on Galaxy feature, an app that makes it possible to run multiple GNU/Linux operating systems on a Samsung smartphone when it’s connected to a DeX dock. There is already a way to run a GNU/Linux environment on any Android device, but it isn’t as sophisticated as Samsung’s implementation.
With DeX, you can have Ubuntu 16.04 or another distribution running through the DeX dock and its connected peripherals. Since these Linux distributions are made for a desktop-oriented UI, Linux on Galaxy is a perfect fit for DeX, which connects your smartphone to a much larger display.
We can expect this feature to be popular among developers, who will now be able to set up a fully functional development environment with all the advantages of GNU/Linux. Samsung is hoping to pull more users away from their laptops and desktops and commit them to its ecosystem, though Linux on Galaxy is still experimental. If you are interested in signing up, you can do so here.
Update: Video
As pointed out by OMG! Ubuntu! (via Android Police), Samsung recently uploaded a video demonstrating Linux programs such as Firefox, Thunderbird, Eclipse, and GIMP. You can check it out below.
Upgrading an existing Android device to a new version of Android can be a long and arduous process, according to Sony. Part of the issue revolves around waiting for vendors (like Qualcomm) to provide device makers (like Sony) with updated HAL source code or binaries in order to work with the new version of Android. Thanks to Project Treble, device makers can start work on the next Android version much more quickly; at least, that’s the idea behind it.
We’ve talked ad nauseam about the potential benefits of Treble for custom ROM development, with several Treble-compatible devices now capable of enjoying ROMs such as LineageOS 15.1, CarbonROM, and more. But there’s one question that has always lingered in the back of our minds: what happens when Android P rolls around? Will we be able to flash an Android P Generic System Image (GSI) on top of a device with an Android 8.1 Oreo vendor image? This is a question that nobody has been able to truly answer, since Android P source code is not available (and thus an Android P GSI cannot be built), so some developers were skeptical of this ever happening.
However, a new commit suggests that Google is testing exactly that on the Google Pixel 2.
What is being shown here is that Google is updating the Vendor Test Suite (VTS) to allow for testing an Android P GSI on top of an Android 8.1 Oreo vendor image. The device this is being tested on is the Google Pixel 2 (the “wahoo” device family). Google tests that this configuration does in fact boot, which is a requirement for passing the VTS.
What does this mean for us? Unfortunately, it’s hard to extrapolate. We can’t say this proves that any upcoming device launching with Android 8.1 Oreo (such as the Huawei P20 or Xiaomi Mi Max 3) will be able to boot an Android P GSI out of the box, since we don’t have more information, nor do we have an Android P system image to test with. At the very least, this shows that work is progressing nicely on Treble, and once Android P source code eventually drops, we can finally put these claims to the test.
from xda-developers http://ift.tt/2BWzEtB
via IFTTT
Google Photos v3.15 has begun rolling out to users on the Play Store, and as usual, Google has not published an official changelog. While live changes are few, we did an APK teardown of the app and found that Google is planning to allow users to like photos in shared albums. The company may also be planning to allow users to export Motion Photos as GIFs. Let’s take a look at the changes one by one:
An APK teardown can often predict features that may arrive in a future update of an application, but it is possible that any of the features we mention here may not make it into a future release. This is because these features are currently unimplemented in the live build and may be pulled by Google at any time in a future build.
Like photos in shared albums
<string name="photos_hearts_viewbinder_user_liked_a_photo">%s liked a photo</string>
<string name="photos_hearts_viewbinder_user_liked_a_video">%s liked a video</string>
The strings indicate that users will soon be able to “like” photos in shared albums, which are albums shared between different users. The “like” feature will be part of an upcoming “Favorites” feature that has shown up in strings for a long time. Just like “Favorites,” the “like” feature hasn’t gone live yet.
Export Motion Photos as GIFs
A second set of new strings is most likely related to Motion Photos. As of now, Motion Photos can be exported as either still photos or videos. Soon, users will be able to export them as GIFs as well. This feature is hardly groundbreaking, but when it goes live, it will let users share Motion Photos in a standard, widely supported file format.
Current export options for Motion Photos
Let us know in the comments if you spot anything new, and follow our APK Teardown tag for more articles like this!
from xda-developers http://ift.tt/2FEYH48
via IFTTT