Recap of Apple Event '24: Coolest (AI) Features

Published: Sep 10, 2024 Duration: 00:11:39 Category: Education

Introduction to Apple Event '24
Hey everyone, and welcome back to the channel. So Apple's big September event has wrapped up, and while we didn't get any jaw-dropping surprises, there are a few AI-driven updates that are worth looking into. Some of these features are refinements, while others push Apple further into the AI space. Today I'm going to break down the updates that Apple unveiled, from smarter notifications to AI-generated images and, of course, the latest improvements to Siri. Let's get into it.

AI in Writing
Let's start with AI writing tools. Now, this is not a revolutionary concept if you've used Grammarly or ChatGPT before, but Apple has integrated similar capabilities directly into the operating system. This means you can now proofread, rewrite, and even adjust the tone of your text in real time. For instance, if you're writing a long email or want to condense it down into key points, Apple's AI, or Apple Intelligence, will do that automatically. Let's say you're working on an email but you're struggling with the tone: you want it to sound more formal, or maybe more casual. Instead of rewording everything manually, the AI can adjust that for you. It can also summarize lengthy documents or messages, which is especially handy if you deal with a lot of text. This could be really helpful for sending emails, or even for content creators drafting scripts. For me, I'd probably use it to rewrite quick notes or emails when I'm in a rush. It's not groundbreaking, but it's practical.

Smart Reply
Next up is smart replies. You might have seen this in Gmail or even Google Messages, but Apple's bringing it to the iPhone. Basically, the AI reads the context of a conversation and suggests replies. So if a friend says "hey, let's catch up later," instead of just suggesting a generic reply, the AI might suggest something more relevant, like "that sounds good, I'm free at 5:00 p.m."
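To make the idea concrete, here is a toy sketch of context-aware reply suggestions. This is not Apple's implementation (which presumably uses a language model); it's just a minimal keyword-matching illustration, and all the rules and reply strings are made up for the example.

```python
# Toy illustration of context-aware smart replies (NOT Apple's actual
# implementation): match keywords in the incoming message to reply
# templates instead of always offering a generic "sure".

RULES = [
    ("catch up", ["That sounds good, I'm free at 5:00 p.m.",
                  "Sure, when works for you?"]),
    ("running late", ["No worries, see you soon!",
                      "Thanks for the heads-up."]),
]

GENERIC = ["Sure", "Sounds good", "OK"]

def suggest_replies(message: str) -> list[str]:
    """Return reply suggestions based on simple keyword matching."""
    text = message.lower()
    for keyword, replies in RULES:
        if keyword in text:
            return replies
    return GENERIC  # no context matched: fall back to generic replies

print(suggest_replies("Hey, let's catch up later"))
# → ["That sounds good, I'm free at 5:00 p.m.", "Sure, when works for you?"]
```

The gap between this and a useful feature is exactly the concern raised below: a real system has to understand context, not just spot keywords, or the suggestions stay generic.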
I can see this being really useful when you're in a rush. For example, if you're on the go and don't have time to type out a thoughtful response, the AI can handle it. That said, it's not exactly new tech; we've seen this on other platforms for years, and Apple's just integrating it more tightly with their own ecosystem. One thing I wondered, though, is how smart it will really be. Sometimes these automated replies feel a bit too generic or impersonal. If Apple's AI can genuinely understand the context well enough to suggest thoughtful, human-like responses, that's a win. But if it's just offering canned responses, it might not add much value beyond saving a few seconds.

Priority Notifications
Priority notifications is where Apple's AI is trying to simplify your day-to-day life by managing the flood of alerts we get. Instead of showing every notification you receive, it summarizes them and highlights the ones that matter the most. The AI prioritizes the ones that are relevant based on your usage patterns and interactions. What's interesting here is how Apple is using machine learning to understand which notifications you actually engage with. Over time it learns what is important to you, so if you're someone who never opens Instagram alerts but always checks email from your boss, the AI will push the emails to the top and let the rest take a back seat. It's essentially filtering out the noise to help you stay focused. This could be really helpful if you, like me, get tons of notifications every day; whether it's from social media apps or even email newsletters, it's easy to get overwhelmed. Instead of constantly checking my phone for every little thing, I would see a clean summary of what's really important. However, I do wonder if people will trust AI enough to manage their notifications for them. It's a tricky balance, right? What the AI deems important might not always align with what you want to prioritize, and there's always the risk of missing something you would actually want to see. So while this is a neat feature, it will depend on how accurately it can learn your preferences.

Visuals: GenMoji & Image Playground
Apple's pushing visual generation with tools like Genmoji and Image Playground. Genmoji lets you create your own emojis based on text descriptions. For example, you could describe something as specific as "cowboy frog," and the AI will generate an emoji that matches. Honestly, this feels more like a fun, gimmicky feature. I can see people playing around with it for social media or group chats, but it's not something that's going to change how you use your iPhone. Then there is Image Playground, which allows you to create fully generated images from sketches or descriptions. If you're someone who likes to design or needs quick visuals, this could be handy: imagine sketching a basic shape of a house or a tree and then having the AI fill in the details, colors, and textures. This uses Apple's AI algorithms to interpret your input and create something polished. However, I don't personally know people who sketch on an iPad or iPhone; in meetings I usually see people using pen and paper for that and then submitting the request to a designer for a digital copy. So I'm not quite sure how broad the appeal is going to be, just because people tend to use pen and paper. That said, Apple is not the first to offer this kind of AI image generation; we've seen similar capabilities with tools like DALL·E and Midjourney. Apple's advantage here is its seamless integration into the ecosystem.

Clean Up
Another AI-driven feature is Clean Up, which lets you remove unwanted objects from your photos with a single tap. It's pretty much Apple's version of Google's Magic Eraser, and it works similarly: highlight something you don't want in the shot, like a person or an object, and the AI automatically removes it while blending the background. This uses machine learning to understand the context of the photo and fill in the removed space with surrounding pixels. So if there is an unwanted person in the background, the AI reconstructs the area around them to make the edit seamless. Personally, I've had plenty of moments where a great shot was ruined by something in the background, whether it's someone walking through the frame or just random mess, so this could be a lifesaver for a quick photo edit. I've used similar tools before, but having it built directly into the iPhone's Photos app makes it super convenient. That said, Apple is a little late to the game here: Google has been doing this for a while, and apps like Photoshop have had object removal for years. Still, it's a welcome addition, especially for those who want quick and easy edits without needing extra software.

Siri Update
All right, Siri has gotten a significant boost this year with AI knowledge and ChatGPT integration. This allows Siri to handle more complex tasks and provide better responses. For example, if you are trying to find something specific on your phone, like a note or a receipt from a month ago, Siri can dig through your apps, messages, and emails to find it. The key here is that Siri now remembers the context of previous interactions within a conversation. So if you ask Siri to adjust a setting or pull up some information, it keeps track of what you've already requested and builds on that. The integration with ChatGPT also means that Siri can generate more conversational and nuanced responses, tapping into a much larger base of knowledge. While this sounds great on paper, I'm curious to see how fluid this integration really is. Siri has been playing catch-up with Google Assistant and Alexa for years, and this feels like Apple's attempt to bridge that gap. But if it still stumbles over basic tasks or can't handle real-world queries well, it might not live up to its potential. The ChatGPT integration could be the game changer here, but the execution will be key.

Live Captions
Apple is introducing Live Captions across their ecosystem. This feature will automatically caption any audio content in real time, including FaceTime calls, videos, and even podcasts. This is particularly useful for those who are deaf or hard of hearing, but I can also see it being helpful for anyone who prefers reading along with audio content or struggles to understand a foreign language. Whether you're watching a video in a noisy environment or just need a little extra clarity, this feature can provide it. Live captions aren't new; Google's been doing this for a while. But having it built into iOS and macOS is a solid addition. I don't think it's going to be a headline feature for most users, but for those who need it, it will be quite useful.

Sound Adjustments
Apple has also enhanced its sound adjustment and noise isolation features, especially for AirPods Pro. The AI-driven noise cancellation will now block out background noise even more effectively during calls, and the Voice Isolation feature makes sure your voice stands out clearly. This works by focusing the microphones on the frequencies associated with your voice while dampening other ambient sounds. Whether you're in a crowded cafe or walking down a busy street, the AI isolates your voice, making it easier for the person on the other end to hear you. This could be really useful if you're someone who takes a lot of calls on the go. I often find myself dealing with background noise while driving or walking around, so having better noise isolation could make a noticeable difference in call quality. It's not entirely new, though; noise isolation is something AirPods have done well for a while, so this feels more like an incremental update rather than a groundbreaking one. Still, for those who rely on their AirPods for calls, it's a welcome improvement.

Notes In-Built Calculator (not AI)
One of the smaller features that I actually find really handy is the in-built calculator in Notes. Now you can do quick calculations without having to leave the Notes app. It's a minor feature, but one that I think a lot of people will appreciate, especially if you're someone who takes notes during meetings or on calls. I've personally had plenty of moments where I'm jotting something down and need to do a quick calculation, and switching between apps just adds unnecessary friction; this streamlines that process. It's not revolutionary by any means, but it's one of those quality-of-life features that just makes the overall experience smoother. I think more features like this, small but practical improvements, are what people actually want to see.

So that's the rundown of Apple's AI-driven features from the September event. Honestly, none of these features are revolutionary, but they do reflect Apple's ongoing refinement of its ecosystem. We're seeing more integration of AI into everyday tasks, from writing tools and smart replies to smarter notifications and photo editing. It's clear Apple is pushing toward making its devices more intelligent, but without drastically changing the user experience. For me, the most practical updates are probably the Clean Up feature for photo editing and the AI writing tools; these are things I can see myself using pretty regularly. Features like Live Captions are important steps forward for accessibility, and I'm glad Apple's putting emphasis there. But at the end of the day, a lot of these updates feel more like catching up to the competition than breaking new ground. What about you? Which feature do you think will actually make a difference in your daily life? Let me know in the comments, and as always, don't forget to like, subscribe, and hit that notification bell so you don't miss the next deep dive into all things tech. I'll see you in the next video.
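As a footnote to the priority-notifications discussion above, the "learns what you engage with" idea can be sketched as a tiny ranking model. This is purely illustrative, not Apple's algorithm; the class name, the open-rate scoring, and the example apps are all assumptions made up for the sketch.

```python
# Toy sketch of engagement-based notification ranking (NOT Apple's
# algorithm): apps whose past notifications you actually opened get
# pushed to the top, and the rest take a back seat.

from collections import defaultdict

class NotificationRanker:
    def __init__(self):
        self.shown = defaultdict(int)   # notifications delivered per app
        self.opened = defaultdict(int)  # notifications opened per app

    def record(self, app: str, was_opened: bool) -> None:
        """Update engagement history for one delivered notification."""
        self.shown[app] += 1
        if was_opened:
            self.opened[app] += 1

    def score(self, app: str) -> float:
        """Open rate with a mild prior, so unseen apps rank mid-pack."""
        return (self.opened[app] + 1) / (self.shown[app] + 2)

    def prioritize(self, pending: list[str]) -> list[str]:
        """Sort pending notifications by learned engagement, highest first."""
        return sorted(pending, key=self.score, reverse=True)

ranker = NotificationRanker()
for _ in range(10):
    ranker.record("Mail", was_opened=True)        # boss's emails: always opened
    ranker.record("Instagram", was_opened=False)  # alerts never opened
print(ranker.prioritize(["Instagram", "Mail"]))   # → ['Mail', 'Instagram']
```

Even this toy version shows the trust problem raised in the video: the ranking is only as good as the history it learned from, so a rarely-opened app that suddenly matters would still be buried.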
