Apple has unveiled a new small language model called ReALM (Reference Resolution As Language Modeling), designed to run on-device and make voice assistants like Siri smarter by helping them understand context and resolve ambiguous references.
Apple’s Foray into AI
This comes ahead of the expected unveiling of iOS 18 at WWDC 2024 in June, where a revamped "Siri 2.0" is rumored, although it's unclear whether ReALM itself will be integrated into Siri over time. Apple has been actively expanding its AI endeavors, evident in new models, tools, and partnerships.
ReALM: Advancing AI Efforts
ReALM is the latest announcement from Apple’s rapidly growing artificial intelligence research team. It’s aimed at improving existing models to make them faster, smarter, and more efficient. The company claims that it even outperforms OpenAI’s GPT-4 in some tasks. We’ll keep you updated on its integration into iOS 18.
The Future of Siri and AI
What does ReALM mean for Apple’s AI efforts? Apple seems to be taking a “release everything and see what happens” approach to AI at the moment. Rumors of partnerships with Google, Baidu, and OpenAI abound. However, Apple’s own developments in AI showcase its commitment to innovation in this domain.
Much of this research focuses on running AI models locally, without sending large amounts of data to the cloud for processing — necessary both to reduce costs and to align with Apple's privacy commitments.
ReALM’s integration into Siri could enhance its contextual understanding, enabling more conversational capabilities without deploying large language models. Combined with other recent Apple research, this signifies the company’s ongoing investment in AI assistants rather than solely relying on third-party models, notes NIX Solutions.
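The core idea behind ReALM is to recast reference resolution as a plain text task: candidate entities (on-screen items, conversation history) are serialized into a numbered list, and the language model is asked which entry a phrase like "call that number" refers to. A minimal sketch of that prompt construction — the entity fields and wording here are illustrative assumptions, not Apple's actual format:

```python
from dataclasses import dataclass

@dataclass
class Entity:
    kind: str   # hypothetical category label, e.g. "phone_number", "business_name"
    text: str   # surface text as it appears on screen or in conversation

def build_prompt(entities: list[Entity], user_request: str) -> str:
    """Serialize candidate entities into a numbered list so a language
    model can answer reference resolution as a multiple-choice question."""
    lines = [f"{i}. ({e.kind}) {e.text}" for i, e in enumerate(entities, 1)]
    return (
        "Candidate entities:\n"
        + "\n".join(lines)
        + f"\n\nUser request: {user_request}\n"
        + "Which entity number does the request refer to?"
    )

entities = [
    Entity("business_name", "Joe's Pizza"),
    Entity("phone_number", "+1 555 0134"),
]
print(build_prompt(entities, "call that number"))
```

Because everything is flattened into text, even a small on-device model can handle the task — no vision module or large cloud model is required, which fits the local-processing theme above.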
Apple’s ReALM marks a significant step forward in enhancing Siri’s capabilities and underscores the company’s dedication to AI advancement. Stay tuned for updates on its integration and impact on iOS 18.