{"id":16506,"date":"2025-05-24T01:22:06","date_gmt":"2025-05-23T23:22:06","guid":{"rendered":"http:\/\/plus.maciejpiasecki.info\/index.php\/2025\/05\/24\/the-ultimate-google-i-o-2025-wrap-up-guide-all-killer-no-filler\/"},"modified":"2025-05-24T22:02:04","modified_gmt":"2025-05-24T20:02:04","slug":"the-ultimate-google-i-o-2025-wrap-up-guide-all-killer-no-filler","status":"publish","type":"post","link":"https:\/\/plus.maciejpiasecki.info\/index.php\/2025\/05\/24\/the-ultimate-google-i-o-2025-wrap-up-guide-all-killer-no-filler\/","title":{"rendered":"The Ultimate Google I\/O 2025 Wrap-Up Guide: All Killer, No Filler"},"content":{"rendered":"<p>The Google I\/O 2025 keynote is now over, and Google announced a ton of new stuff today. All of it, unsurprisingly, revolves around AI and Google\u2019s many different AI tools and features. From new monthly subscription plans and their new names to Gemini coming to Google TV and vehicles, AI is the complete focal point of Google I\/O this year. Of course, AI has been the center of attention at past events. That said, it\u2019s never really encompassed the entirety of Google\u2019s keynote speech.<\/p>\n<p>Putting this much focus on Gemini and AI in general does make sense, though. Google has invested boatloads of money into its AI models, so it wants to integrate them into every product it offers. There were occasional segments that weren\u2019t entirely about AI, such as the Wear OS 6 announcement that went out well before the keynote was over. For the most part, however, AI was the centerpiece and the hot topic of conversation. While Google\u2019s biggest announcement is hard to pin down, there were several showstoppers for AI enthusiasts.<\/p>\n<p>Android XR, new Google AI subscription plans, the US rollout of AI Mode, and Gemini coming to Google TV were some of the bigger stories. 
You\u2019ll find all of that info and more in the brief descriptions below.<\/p>\n<p>With that said, here\u2019s every major new announcement that Google made at Google I\/O 2025.<\/p>\n<p>Google I\/O 2025: All Major Announcements<\/p>\n<p>As mentioned, Google talked a lot about AI this year. That shouldn\u2019t be surprising, as AI is Google\u2019s new baby. Whether we like it or not, it\u2019s going to be baked into everything Google does from now on. Interestingly, Google says it only mentioned AI 92 times during the keynote this year. That\u2019s compared to the 121 times it was mentioned during the Google I\/O 2024 keynote. That\u2019s also likely because Google mentioned Gemini 95 times this year as opposed to just \u201cAI.\u201d All of that said, you\u2019ll find brief overviews of all the Google I\/O 2025 announcements we covered below, complete with relevant links to the dedicated articles. So let\u2019s jump into it.<\/p>\n<p>Google launches Android 16 QPR1 Beta 1<\/p>\n<p>Android 16 is fast approaching, but there\u2019s still a little while longer to wait. However, Google just launched the first QPR1 beta build of Android 16 today at Google I\/O 2025. That means you can download and install it on your device right now, provided you\u2019re enrolled in the Android Beta Program. This will be limited to Pixel devices for the moment. If you have a compatible device and you\u2019re enrolled, you\u2019ll get a taste of Material 3 Expressive.<\/p>\n<p>Google announces Wear OS 6<\/p>\n<p>Fans of Google\u2019s smartwatch platform received a nice little treat today with the official announcement of Wear OS 6. Google talked about the new version last week, but it didn\u2019t say much about it. Now it has fully announced the next version of Wear OS, talking more in depth about some of the big changes coming to the platform. One of those big changes is the new design language of the operating system. 
Wear OS 6 will feature the Material 3 Expressive design language, bringing a pretty notable visual change to how the software looks. Google is also adding some new Tile components for developers, as well as more watch face customization options for users to choose from.<\/p>\n<p>Google Gemini app users are now in the hundreds of millions<\/p>\n<p>Google\u2019s Gemini app has been used by tons of people since it first became available, and it\u2019s easy to see why: it can provide some genuinely useful interactions that Google Assistant can\u2019t. Now, Google has shared just how much use the app has been getting since its launch. According to Google, the Gemini app now has 400 million active users every single month. That\u2019s a HUGE number for any app. Then again, this is Google\u2019s eventual replacement for Assistant, so it\u2019s no wonder that so many people are weaving it into their daily lives.<\/p>\n<p>Gemini is coming to the Chrome browser<\/p>\n<p>Everyone knew this was coming eventually, but Google has just now made it official. Gemini is coming directly to the Chrome web browser with a huge suite of features, ready to make your browsing a little more convenient. It\u2019ll roll out first to the desktop version of Chrome and will initially only be available to Google AI Pro and Google AI Ultra subscribers. You\u2019ll be able to do all kinds of things with it, including using it as a tutor for your kids.<\/p>\n<p>Google is adding a bunch of improvements to the Play Store<\/p>\n<p>AI might be Google\u2019s major focus for everything right now, but it wouldn\u2019t be a Google I\/O conference without something that touches on the Play Store, seeing as it\u2019s the hub for Android apps. Today, Google announced several new improvements coming to the Play Store. 
This includes topic browse pages, new tools for developers to make subscriptions better, and audio samples that users can listen to for apps that provide audio content.<\/p>\n<p>Project Starline is now Google Beam<\/p>\n<p>Remember Project Starline? Google announced this years ago, and it\u2019s been a little while since we\u2019ve heard anything about it. As it turns out, you probably won\u2019t be hearing anything about Project Starline anymore. That\u2019s because it\u2019s now Google Beam, Google\u2019s official video conferencing setup that will make you feel like you\u2019re actually in the room with the people on the call. Project Starline was already a pretty cool innovation. However, Google Beam seeks to take it to the next level with further advancements. Of course, AI is integrated into this new version of the setup.<\/p>\n<p>Google I\/O 2025 AI Innovations: Gemini 2.5 and Deep Think<\/p>\n<p>Android Studio gets Gemini 2.5 Pro<\/p>\n<p>When we said Gemini was everywhere at Google I\/O this year, we weren\u2019t kidding. Android Studio will now be getting Gemini 2.5 Pro to help developers create apps using Gemini\u2019s powerful suite of tools and features. While there will be many things Gemini helps developers do in Android Studio, one of the more useful will be its capability to analyze app crashes and figure out what caused them in the first place.<\/p>\n<p>Gemini is headed to vehicles<\/p>\n<p>When you think about upgrades to your vehicle, would you ever have imagined that Gemini would be one of those upgrades? Whether you have thought about it or not, that\u2019s exactly what\u2019s happening. Google officially announced that Gemini is coming to vehicles. Like all things Gemini in every other area it now resides, Gemini in vehicles is aimed at helping drivers be more efficient. For example, no one likes drives made longer by traffic and congestion. 
So, imagine a world where you tell Gemini in your car about the congestion, and it finds a way for you to skirt around it. That\u2019s just one possibility.<\/p>\n<p>Volvo spearheads Google\u2019s Gemini integration in vehicles<\/p>\n<p>You already know that Gemini is coming to vehicles, but did you know that Volvo is already working on integrating it into a new model? This new vehicle was shown off at Google I\/O 2025, and it\u2019s painting a promising picture of what\u2019s to come. We were lucky to check out a demo of this new software integration at the event, and alongside Gemini, \u2018Google Builtin\u2019 vehicles will also start to incorporate things like movies and games.<\/p>\n<p>This essentially turns your car\u2019s in-dash display into a full-on entertainment device. Naturally, it will only work when the car is parked, as it should. That being said, think about all the times you might be parked and sitting in your car: waiting at a doctor\u2019s office, waiting for your significant other to come out of the shops, or waiting for your EV to charge. All these scenarios are about to get a lot less boring.<\/p>\n<p>Google TV will soon prompt users to leave app reviews<\/p>\n<p>App reviews can play an important role in helping developers fix issues (if there are any), but sometimes users simply forget to leave them. That\u2019s as true for Google TV as it is for Android, and Google wants to try and change that. To that end, it\u2019s introducing a new feature for developers who make apps for Google TV. This new feature will prompt users to leave reviews for apps, letting developers know what they like and dislike about them. Google says these prompts won\u2019t pop up while users are actively consuming content. 
Rather, they\u2019ll pop up when the UI is idle.<\/p>\n<p>Gemini will be able to control your phone<\/p>\n<p>Gemini Live is one of the more intriguing parts of Gemini, and soon it\u2019ll be getting a very useful feature that can help users be even more hands-free \u2013 the ability to control your phone. During Google I\/O 2025, Google showed off a demo of this working, and it seems genuinely helpful. Once you open Gemini Live, you can ask Gemini to do something for you, like find a user guide, and it\u2019ll search the web and grab the file for you. It can even scroll through the guide for you if you direct it to look for something specific. On top of this, it will be able to make calls for you and ask questions.<\/p>\n<p>Google I\/O 2025 Hardware: Android XR Glasses and Partnerships<\/p>\n<p>Samsung\u2019s Project Moohan XR headset launches later this year<\/p>\n<p>Android XR has been briefly mentioned a few times leading up to today\u2019s event, but there were still plenty of unknown details about the ecosystem and the hardware that will run on the platform. Today, Google confirmed that Samsung\u2019s Project Moohan will be the first headset to run on Android XR. Google has also confirmed that the headset will be launching later this year, as suspected.<\/p>\n<p>Android XR gets its second developer preview<\/p>\n<p>Today, Google announced the second developer preview for the Android XR SDK, giving developers new tools and features to work with when developing apps for the upcoming platform. This includes a new Jetpack XR SDK that allows developers to play back videos in 180 degrees and 360 degrees. 
Additionally, Google is getting the Play Store ready for Android XR apps, so users will have an easy time finding apps to install once Project Moohan launches later this year.<\/p>\n<p>Google unveils Android XR smart glasses<\/p>\n<p>Google\u2019s history with smart eyewear hasn\u2019t exactly been a bright spot in the industry since Google Glass, thanks to privacy concerns and a release that honestly came a bit too early. Now, though, with popular options like the Meta Ray-Bans on the market, Google has officially revealed its new vision for smart eyewear with the Android XR smart glasses. These glasses are built around Gemini and will offer tons of AI-powered features that can all be used hands-free.<\/p>\n<p>Google and XREAL announce Project Aura<\/p>\n<p>Android XR isn\u2019t just limited to Project Moohan and Google\u2019s own Android XR glasses; there\u2019s also Project Aura from XREAL. You may already know of XREAL from its collection of AR glasses like the XREAL Air 2 Ultra, One, and One Pro. XREAL is making the Project Aura glasses for Google\u2019s Android XR platform, and the company says it wants Project Aura to \u201cset the standard for Android XR.\u201d It\u2019s too early to tell if it will be able to accomplish this, but it has a good background in this space, so there\u2019s a good chance it will deliver. Project Aura will be powered by Gemini AI and provide users with a large field-of-view experience.<\/p>\n<p>Android XR has its first two eyewear partners<\/p>\n<p>In addition to officially announcing Android XR and when it will launch, Google also announced its first two eyewear partners who will be creating XR Glasses. Those two partners are Warby Parker and Gentle Monster. While Google didn\u2019t announce pricing or show off any designs, the two partners\u2019 existing glasses span a significant range in cost. 
So it\u2019s entirely possible that their XR Glasses will differ in cost as well.<\/p>\n<p>The results are in: Google\u2019s XR Glasses are HOT!!!<\/p>\n<p>Android XR is coming later this year, and it\u2019s everything Google has seemingly been working toward since the not-so-graceful exit of Google Glass. That exit seems likely to be all but forgotten when Android XR launches. We got to test out Google\u2019s XR Glasses for ourselves at Google I\/O 2025 and, well, it seems our own Alex Maxham is quick to toss aside his Meta Ray-Ban Smart Glasses for what Google plans to offer. They\u2019re lightweight, they look stylish, and after an impressive demo, we can\u2019t wait to buy a pair.<\/p>\n<p>Google I\/O 2025 Search Upgrades: AI Mode and New Features<\/p>\n<p>AI Mode is rolling out to US-based users on Android and iOS<\/p>\n<p>AI Mode is a powerful new tool within Google Search, and up until now, it\u2019s only been available to a small group of users. That\u2019s changing quite soon, as Google officially announced that it\u2019s coming to all users in the US on both Android and iOS devices. The rollout started on May 20, and it\u2019ll be expanding to all US-based users throughout the week. Additionally, Gemini 2.5 is coming to Search later this week.<\/p>\n<p>Gemini Live camera and screen sharing rolls out to Android and iOS<\/p>\n<p>Gemini Live\u2019s camera and screen sharing feature had been available for a short while prior to today\u2019s event, but only for a select number of phones. During the keynote today, Google announced that the camera and screen sharing feature for Gemini Live is rolling out to Android and iOS devices. Google is expanding its reach and getting the feature into the hands of many more users, starting today. 
The feature will be available on all Android and iOS devices that have access to Gemini Live.<\/p>\n<p>Google Meet will soon feature real-time translation powered by Gemini<\/p>\n<p>Google Meet has turned out to be a very useful tool for video conference calls and video chats between family and friends, and now it\u2019s going to get even better with help from Gemini. Today, Google announced that Google Meet will be getting Gemini integration for real-time translation features. The translation will only support English and Spanish initially, but Google says more languages will be added in the coming weeks.<\/p>\n<p>Google wants Gemini to be a true universal AI assistant<\/p>\n<p>Gemini and Google\u2019s AI models in general are already pretty advanced, but Google wants to take Gemini even further beyond being the evolution of Google Assistant by making it a \u201ctrue universal AI assistant.\u201d What does that mean exactly? Well, in Google\u2019s terms, it\u2019s Gemini having the capability to do more than just process information. Google wants Gemini to be able to understand context, learn from your daily life, and anticipate things you might need throughout the day. All in the name of making your life simpler and you more efficient.<\/p>\n<p>AI Mode gets powered up with the new Deep Search feature<\/p>\n<p>If you\u2019re a fan of AI Mode and all that it already offers, you\u2019re probably going to love the new Deep Search feature. While this isn\u2019t available just yet, Google says it\u2019s coming this summer, and it\u2019s likely going to have a huge impact on AI Mode, as well as how many people use it on a daily basis. With Deep Search, AI Mode can process hundreds of searches for you and then research across all of the information it finds to create an \u201cexpert-level, fully-cited report.\u201d<\/p>\n<p>In other words, it should help you lessen the number of separate searches you have to do. 
Instead, it\u2019ll do all of those search variations for you in one search query and then give you everything at once. You\u2019ll still have to sift through a few things to find the best result. That being said, you won\u2019t have to do that for each separate search, so in theory, it should speed things up immensely.<\/p>\n<p>AI Mode\u2019s new \u2018Agent Mode\u2019 feature completes complex tasks for you<\/p>\n<p>Agent Mode might just be one of the more exciting AI announcements from the event today. AI Mode is Google\u2019s super-powered Google Search tool. Agent Mode, on the other hand, is a new feature coming to AI Mode that is designed to complete complex, mundane tasks that you probably would prefer not to do. They still need to be done, but wouldn\u2019t it be nice if they were done for you? Agent Mode will do just that, to a degree. In Google\u2019s example, Agent Mode can research apartment listings for you based on your criteria for a place, like monthly rent cost, whether or not it has an in-unit washer and dryer, and other factors. It can then find relevant results and even schedule tours for you to view those listings, adding those tour dates to your calendar so you don\u2019t miss the appointments.<\/p>\n<p>Google TV will become smarter with Gemini soon<\/p>\n<p>If you\u2019ve ever thought your Google TV wasn\u2019t smart enough, don\u2019t worry, because in the near future, it\u2019s going to be a lot smarter thanks to Gemini integration. While there will be a few new features to benefit from, one of the main things users will get to enjoy is better content recommendations. 
According to Google, users will be able to \u201cspeak more naturally and conversationally to find content they\u2019d like to watch.\u201d<\/p>\n<p>AI Mode can be your personal shopper<\/p>\n<p>Many of us shop online these days, but sometimes we just don\u2019t have time to go through the process of finding what we want or need, adding it to a cart, and checking out. In the future, AI Mode will be able to do some of your online shopping for you. With the power of Gemini and its upcoming agentic capabilities, AI Mode will be able to track prices on items for you, and when a price drops, it will be able to add that item to your cart and purchase it on your behalf. This isn\u2019t really something that people need to have, but it sure will be a nice, convenient feature.<\/p>\n<p>Google I\/O 2025 Pricing: AI Ultra Plan and Subscription Details<\/p>\n<p>Paid AI subscriptions now come in two flavors<\/p>\n<p>Things are changing with Google\u2019s AI subscription plans, as Google announced two new plans for users who want to make the most of what the company has to offer. Technically, there\u2019s one new plan, and one is simply being renamed. The old paid plan is now Google AI Pro, which is set at $19.99 a month. The new plan is Google AI Ultra, featuring Google\u2019s most powerful suite of AI tools at a whopping $249.99 a month. That\u2019s quite expensive, but hey, you get YouTube Premium for free with it!<\/p>\n<p>Google I\/O 2025 Creative Tools: Veo 3, Imagen 4, and Flow<\/p>\n<p>New Veo 3 video generation tool announced, with new \u2018Flow\u2019 film-making tool in tow<\/p>\n<p>AI is taking over and is being injected into every corner of every industry, and film-making seems to be no different. Today, Google announced Veo 3, the third generation of its AI-powered video generation tool. Alongside the Veo 3 announcement, Flow was introduced. 
Flow is a new filmmaking tool for the Veo 3 model that puts some powerful tools in the hands of creatives. From scene builders to camera controls, Flow is poised to make creating film projects as effortless as you want them to be.<br \/>\nThe post The Ultimate Google I\/O 2025 Wrap-Up Guide: All Killer, No Filler appeared first on Android Headlines.<br \/>\n<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/plus.maciejpiasecki.info\/wp-content\/uploads\/2025\/05\/Google-IO-2025-scaled-1.jpg\" width=\"2560\" height=\"1340\"><br \/>\nSource: androidheadlines.com<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Google I\/O 2025 keynote is now over, and with it, Google announced a ton of new stuff today. All [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":16507,"comment_status":"false","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-16506","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-bez-kategorii"],"_links":{"self":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts\/16506","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/comments?post=16506"}],"version-history":[{"count":1,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts\/16506\/revisions"}],"predecessor-version":[{"id":16508,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts\/16506\/revisions\/16508"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/plus
.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/media\/16507"}],"wp:attachment":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/media?parent=16506"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/categories?post=16506"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/tags?post=16506"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}