These Cool Android 12 Features Should Come to iOS 16

Last week, Google officially released Android 12 to the public. For now, only owners of compatible Google Pixel smartphones can install it, but that didn’t stop us from digging into all the new features in the update. Most of them, as usual, were no surprise, because they appeared in third-party Android skins quite a long time ago. However, there are a few innovations that would be worth adopting, not only for Android smartphone manufacturers but for Apple as well.

Android 12 design

Material You would look good on iOS too.

Last year Apple already started lifting restrictions on iOS customisation by letting users make changes to the home screen layout. Admittedly, that was about it. It would have been much cooler if Apple had allowed app icons to be placed anywhere on the grid rather than arranging them automatically. That would let users set up the home screen exactly the way they want, improving not only the look of the OS but its practical usability as well. Still, this is a very basic change, and only part of what’s needed.

A much more important addition, in my opinion, would be bringing the Material You design concept from Android 12 to iOS. Its distinctive feature is that the system colour scheme adapts to the dominant shade of the wallpaper. It may sound awkward, but in practice it looks very harmonious. And since Safari already has roughly the same mechanism, tinting the address bar in the website’s colours, extending it to the entire OS wouldn’t be a stretch.
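To make the idea concrete, here is a minimal sketch of how an app could pull an accent colour out of an image, roughly the way Material You derives its palette from the wallpaper. This is only an illustration, not Apple’s or Google’s implementation: it simply averages the picture with Core Image’s CIAreaAverage filter, and the "wallpaper" asset name in the usage comment is hypothetical, since iOS doesn’t expose the actual wallpaper to third-party apps.

```swift
import UIKit
import CoreImage

extension UIImage {
    /// Average colour of the image, computed with Core Image's CIAreaAverage filter.
    /// A real dynamic-colour system would build a whole palette, not a single value;
    /// this only shows the general idea.
    func averageColor() -> UIColor? {
        guard let input = CIImage(image: self) else { return nil }

        // Ask Core Image to average every pixel of the image into a 1x1 output.
        let extentVector = CIVector(x: input.extent.origin.x,
                                    y: input.extent.origin.y,
                                    z: input.extent.size.width,
                                    w: input.extent.size.height)
        guard let filter = CIFilter(name: "CIAreaAverage",
                                    parameters: [kCIInputImageKey: input,
                                                 kCIInputExtentKey: extentVector]),
              let output = filter.outputImage else { return nil }

        // Read the single averaged pixel back as RGBA bytes.
        var pixel = [UInt8](repeating: 0, count: 4)
        CIContext().render(output,
                           toBitmap: &pixel,
                           rowBytes: 4,
                           bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                           format: .RGBA8,
                           colorSpace: CGColorSpaceCreateDeviceRGB())
        return UIColor(red: CGFloat(pixel[0]) / 255,
                       green: CGFloat(pixel[1]) / 255,
                       blue: CGFloat(pixel[2]) / 255,
                       alpha: CGFloat(pixel[3]) / 255)
    }
}

// Hypothetical usage: tint an app's UI from a bundled "wallpaper" image.
// let accent = UIImage(named: "wallpaper")?.averageColor()
// window?.tintColor = accent
```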

What the orange dot means on the iPhone

The indicator in the upper right corner on iOS currently tells you almost nothing.

iOS already has an indicator for camera and microphone activity. However, it is just an orange or green dot that lights up in the top right corner of the screen, with no further explanation. An untrained user would struggle to understand what it means. In Android 12, Google has gone further and built explanations in for users.

First, the indicators themselves are slightly larger than on iOS and carry a camera or microphone silhouette, so you can see which sensor is actually in use. Second, the indicators are tappable for extra convenience: you can always tap the bright icon on the screen and it takes you to an explanatory section that shows why the light is there in the first place.

Customize iPhone Control Center

Control Center on Android 12 is much more convenient than on iOS.

Control Center on iOS is implemented quite conveniently, but it is not without its flaws. In my opinion, it lacks depth of settings. For example, you can’t quickly switch between 5G, LTE and 3G, you can’t disable one of the SIM cards, and you can’t set up Apple Pay from it. In general, there are still quite a few restrictions.

Android used to be much the same, but Android 12 has fixed these shortcomings. Google’s developers not only enlarged the tiles themselves, making them easier to hit, which matters on large screens, but also expanded the range of available settings, so users can now reach and change more advanced options to suit their preferences.

The auto-rotate screen on iPhone

The smart auto-rotate screen should have appeared on iOS first.

Android 12 introduced a very cool feature that Apple should have come up with first. I’m talking about smart auto-rotate, which relies not only on the accelerometer but also on data from the front camera. Thanks to this, if you’re lying down, the system won’t flip the screen, because it can see that your face has turned together with the phone.

This could have been implemented even more elegantly on iOS. Instead of using the front-facing camera, as Google did, Apple could have used Face ID, which would detect the position of the user’s head with greater accuracy. Especially since the TrueDepth sensors already track attention almost continuously, deciding whether or not to dim the screen when the user, for example, looks away.
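As a rough illustration of that logic, the sketch below combines an accelerometer reading from Core Motion with the in-plane face rotation that the Vision framework reports for a front-camera frame, and only rotates when the two agree. It is an assumption about how such a feature might work, not Apple’s or Google’s actual implementation; the FaceAwareRotation class and its method names are made up for the example, and camera capture setup is omitted.

```swift
import CoreMotion
import Vision
import CoreGraphics

// Illustrative only: combine the accelerometer with the face's in-plane rotation
// seen by the front camera, and rotate the UI only when both agree.
final class FaceAwareRotation {
    private let motion = CMMotionManager()

    func start() {
        motion.deviceMotionUpdateInterval = 0.2
        motion.startDeviceMotionUpdates()
    }

    /// The device is physically held in landscape when gravity pulls mostly along the x axis.
    private var deviceSuggestsLandscape: Bool {
        guard let gravity = motion.deviceMotion?.gravity else { return false }
        return abs(gravity.x) > abs(gravity.y)
    }

    /// Roll (in-plane rotation) of the first face found in a front-camera frame, in radians.
    func detectFaceRoll(in frame: CGImage, completion: @escaping (Double?) -> Void) {
        let request = VNDetectFaceRectanglesRequest { request, _ in
            let face = (request.results as? [VNFaceObservation])?.first
            completion(face?.roll?.doubleValue)
        }
        try? VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])
    }

    /// Rotate to landscape only if the accelerometer says "landscape" AND the face
    /// appears rotated relative to the screen. If the face is still upright relative
    /// to the screen, the user has probably turned together with the device
    /// (e.g. lying in bed), so the current orientation is kept.
    func shouldRotateToLandscape(faceRoll roll: Double?) -> Bool {
        guard deviceSuggestsLandscape else { return false }
        guard let roll = roll else { return true } // no face detected: fall back to the accelerometer
        return abs(roll) > .pi / 4
    }
}
```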
