enhancing Siri, the virtual assistant integral to the iPhone experience. Apple's research papers suggest a concerted effort to develop a more efficient and intelligent Siri, potentially revolutionizing user interaction with the iPhone. A notable focus of this research is small language models (SLMs), compact models designed to run entirely on-device rather than in the cloud.
One such model, ReALM (Reference Resolution As Language Modeling), is designed to work out which on-screen or conversational item an ambiguous request refers to by recasting reference resolution as a pure language-modeling task. Analysts speculate that ReALM could play a pivotal role in enhancing Siri's capabilities. Additionally, Apple's research introduces a multimodal AI model dubbed 'Ferret-UI', built to understand user-interface screens and execute precise tasks on them in response to open-ended language instructions.
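To make the reference-resolution idea concrete, here is a minimal sketch of the general technique, not Apple's implementation: on-screen items are serialized into plain text so that a text-only language model can decide which one a request like "call that number" points to. The ScreenEntity class, the tagging scheme, and the example data below are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ScreenEntity:
    """A hypothetical on-screen element the assistant can 'see'."""
    entity_id: int
    kind: str   # e.g. "phone_number", "address", "business_name"
    text: str

def serialize_screen(entities: list[ScreenEntity]) -> str:
    """Flatten screen content into tagged text, one line per entity,
    so a text-only language model can reason about the screen."""
    return "\n".join(
        f"[{e.entity_id}] ({e.kind}) {e.text}" for e in entities
    )

def build_prompt(entities: list[ScreenEntity], request: str) -> str:
    """Assemble a prompt asking the model to pick the referenced entity."""
    return (
        "Screen contents:\n"
        f"{serialize_screen(entities)}\n\n"
        f"User request: {request}\n"
        "Which entity id does the request refer to?"
    )

# Example: a pharmacy listing on screen, plus an ambiguous voice request.
screen = [
    ScreenEntity(1, "business_name", "Main Street Pharmacy"),
    ScreenEntity(2, "address", "120 Main St"),
    ScreenEntity(3, "phone_number", "555-0142"),
]
print(build_prompt(screen, "Call that number"))
# A real system would send this prompt to a (small) language model,
# which should answer "3"; this sketch only shows the prompt construction.
```

In this framing, the model's answer to such a prompt is the resolved reference, which the assistant can then act on.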
This innovation hints at a future where spoken commands could seamlessly replace finger gestures for navigating the iPhone. Among other notable advancements, Apple's research introduces Keyframer, a tool said to generate animations from static images, and an AI model for instruction-based image editing. These innovations could significantly enhance the Photos app, allowing users to perform complex edits with ease.