Call it generative artificial intelligence, ChatGPT or Google Bard (Google's own take), but I want AI to completely change the experience of using Android phones. Something like what Microsoft recently announced for Windows 11. Something that, barring a surprise, will not happen with Android 14.
The idea is essentially to flood the system with artificial intelligence and have an assistant (hello, Google Assistant) help us with absolutely anything we want to do or any question we need to resolve. Small but significant changes that, I am convinced, would especially help less advanced users.
Three fundamental pillars: the system, integration with apps, and answers to questions
The aforementioned Microsoft announcement, which you can see in the video above, focuses especially on getting shortcuts to change system settings. While it is true that operating a computer can be more complex than operating a phone, I see no reason why this could not be just as useful on smartphones.
I am aware that Android has a built-in search engine in its settings, one that is also present in each and every one of its customization layers (Xiaomi's MIUI, Samsung's One UI, OPPO's ColorOS, etc.). However, this search is not always as effective as I would like, at least not when looking for something very specific: if you don't use the exact name of the setting, it won't find it.
I am not asking the AI to invent new features or expecting the phone to suddenly do things it can't, but I do want it to make it easy to access everything we already have. So the first function I would like to see is a quick-access assistant (I insist, it can be Google Bard, Google Assistant or whatever Google wants to call it) that, when asked to activate a certain setting, understands what we mean and shows us a shortcut to enable and/or configure it.
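The plumbing for this already exists, by the way. A minimal sketch in Kotlin, assuming the assistant's language model has already mapped the user's request to a settings category (the keyword matching here is just a hypothetical stand-in for that step): Android exposes intents that deep-link straight into specific settings screens.

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.Settings

// Hypothetical sketch: once the assistant has mapped the user's request
// ("my screen turns off too fast") to a concrete setting, it can deep-link
// straight to that screen using intents Android already provides.
fun openRequestedSetting(context: Context, userRequest: String) {
    val action = when {
        "bluetooth" in userRequest.lowercase() -> Settings.ACTION_BLUETOOTH_SETTINGS
        "screen" in userRequest.lowercase() -> Settings.ACTION_DISPLAY_SETTINGS
        "battery" in userRequest.lowercase() -> Settings.ACTION_BATTERY_SAVER_SETTINGS
        else -> Settings.ACTION_SETTINGS // fall back to the main settings screen
    }
    context.startActivity(Intent(action).addFlags(Intent.FLAG_ACTIVITY_NEW_TASK))
}
```

The missing piece is not the shortcut itself but the understanding: letting the user describe the setting in their own words instead of its exact name.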
Integrating native AI with third-party apps is a challenge I would like Google to take on
Another aspect Microsoft showed is integration with apps, and seeing this on Android would be fantastic. The key is to integrate not only with Google's own apps such as Gmail, Maps and company, but also with third-party apps. The clearest example is, once again, what Windows promises with apps like Spotify.
Other examples: say we want to watch a crime series, and the assistant not only offers recommendations, but bases them on the streaming apps we have installed, with quick access to play them. Or we want to send a certain formal email, and not only does it draft the text for us, but it also opens the email app directly so we can send it, as in the sketch below.
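For that last idea, the hand-off is the easy part. A minimal, hypothetical sketch assuming the AI has already generated the draft: passing it to whatever email app the user prefers is just a standard send intent.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical sketch: the assistant has drafted a formal email and now
// opens the user's email app with the draft prefilled, ready to review and send.
fun openEmailDraft(context: Context, recipient: String, subject: String, body: String) {
    val intent = Intent(Intent.ACTION_SENDTO).apply {
        data = Uri.parse("mailto:") // only email apps should handle this
        putExtra(Intent.EXTRA_EMAIL, arrayOf(recipient))
        putExtra(Intent.EXTRA_SUBJECT, subject)
        putExtra(Intent.EXTRA_TEXT, body)
    }
    if (intent.resolveActivity(context.packageManager) != null) {
        context.startActivity(intent)
    }
}
```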
I also imagine gaining productivity thanks to the AI with actions as interesting as this one: we are in a WhatsApp chat with a friend, let's call him Paco, arranging a dinner. It may sound like science fiction, but it is doable: invoke the AI with a request like "remind me of this", and have it automatically discern the topic of the conversation and add the day, time and place of "Dinner with Paco" to our calendar. Obviously, always based only on the information shared in that conversation.
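The hard part would be extracting the details from the chat; the final step is mundane. A minimal sketch, assuming the model has already pulled out a title, a place and a start time (all hypothetical names here): the assistant can open the calendar app with a prefilled event, which also keeps the user in control of actually saving it.

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.CalendarContract
import java.util.Calendar

// Hypothetical sketch: the assistant has extracted "Dinner with Paco", a place
// and a start time from the conversation, and opens the calendar app with a
// prefilled event for the user to confirm.
fun proposeCalendarEvent(context: Context, title: String, place: String, start: Calendar) {
    val intent = Intent(Intent.ACTION_INSERT).apply {
        data = CalendarContract.Events.CONTENT_URI
        putExtra(CalendarContract.Events.TITLE, title)
        putExtra(CalendarContract.Events.EVENT_LOCATION, place)
        putExtra(CalendarContract.EXTRA_EVENT_BEGIN_TIME, start.timeInMillis)
        // Assume a two-hour dinner if no end time was mentioned in the chat
        putExtra(CalendarContract.EXTRA_EVENT_END_TIME, start.timeInMillis + 2 * 60 * 60 * 1000)
    }
    context.startActivity(intent)
}
```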
The third pillar is answering questions, and this is ultimately the most feasible AI integration for Google. In fact, with Bard we already have it, so it would just be a matter of integrating it into a more accessible chat. This is where all the requests we can already make to a chatbot come in: from learning details about the Second World War to getting the recipe for a certain dish, illustrated with images and/or videos.
And to avoid false information, these answers should also be grounded in what the search engine can find, backing them up with links to the websites the information was collected from. Something similar to what Google already presented at Google I/O 2023 with 'Supercharging Search'.
As a bonus, I would also like this AI to be as accessible as possible, so you don't have to go around in circles to open it. Here Google could implement the shortcuts already seen for Assistant or for other assistants that manufacturers such as Samsung integrate: a physical button on the device, a gesture on the screen that opens the assistant and, of course, activation by voice command.
With what Google has already announced and given the importance of AI, it is more than likely that Android will pick up many functions of this type. I don't know if it will be all of them, or whether they will be better. What is clear is that, seeing how other operating systems such as Windows are advancing here, Google cannot let Android be left behind.
Cover Image | Pexels | wallpaperhub | Mockups Studio
In Xataka Android | Android will also have its official ChatGPT app, we leave you four alternatives while we wait