Apple unveiled ‘Apple Intelligence’, a suite of features that brings generative AI to iPhones, iPads and Macs. Users can transcribe recorded calls, generate emails, summarize notes, erase objects from images, create illustrations and animations on demand, and make custom emoji. Google had already unveiled similar features in September last year, including live transcription of audio recordings.
Samsung, too, is in the fray: the Korean electronics firm offers a native AI model running on its flagship devices to process audio recordings and phone calls, and to summarize webpages. Not all of these features run entirely on the device, though. Apple clarified that while Siri, its on-device digital assistant, can process basic queries locally, broader queries may need to be routed to OpenAI’s ChatGPT, which is built on the GPT-4o multimodal AI model.
Google’s Pixel phones rely on an internet connection to reach large AI models hosted on cloud servers for many of their features. Samsung does the same: the company offers a setting that lets users stick to local AI features, such as live transcription of voice notes, or tap a cloud-based AI model to generate summaries of recorded voice notes. Apple’s AI features will come only to its ‘Pro’ iPhones for now, whereas all of Google’s latest phones support its AI capabilities.
Samsung currently offers AI only on its flagship Galaxy S24 series, but may expand it to more devices next month. It all comes down to your phone’s processor: for now, only flagship chips from Qualcomm and MediaTek support local AI processing. Privacy experts have raised concerns about what could happen to the personal data that an on-device AI model can access on a phone.
Read more on livemint.com