Siri was introduced and integrated into iOS and macOS over ten years ago. In that time, little about it has changed. There are articles from 2016 complaining about how far it lagged behind Google Assistant and Cortana (RIP).
In the meantime, LLMs have changed the game when it comes to language understanding, context, and the ability to act on natural-language instructions. All the tools are here: speech-to-text through models like Whisper, and of course the LLMs themselves, which interpret language and context to a revolutionary degree. Text-to-speech is already solved.
So, ignoring the “AI hype” and looking at what these tools actually are and can do, it follows naturally that a better personal assistant could be built by combining these pieces and leveraging the existing iOS and macOS APIs, improving Siri’s functionality by an order of magnitude or more.
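To make the "combining these pieces" argument concrete, here is a minimal sketch of the pipeline shape: speech-to-text, then a language model turning free-form text into a structured intent, then dispatch to platform APIs. The Whisper and LLM calls are stand-ins (a toy rule-based interpreter fills in for the model so the flow runs end to end); function names and the `Intent` type are my own illustration, not any real Apple or Whisper API.

```python
# Hypothetical assistant pipeline: STT -> interpretation -> dispatch.
# All names here are illustrative, not real iOS/macOS APIs.
import re
from dataclasses import dataclass


@dataclass
class Intent:
    action: str
    argument: str


def transcribe(audio_path: str) -> str:
    # A real build would run a local STT model here, e.g. something like
    # whisper.load_model("base").transcribe(audio_path)["text"].
    # Stubbed so the sketch runs without model weights or audio.
    return "set a timer for ten minutes"


def interpret(utterance: str) -> Intent:
    # Stand-in for the LLM step: map free-form text to a structured
    # intent. An actual assistant would prompt a local model instead
    # of matching patterns.
    text = utterance.lower()
    if m := re.search(r"timer for (.+)", text):
        return Intent("set_timer", m.group(1))
    if m := re.search(r"play (.+)", text):
        return Intent("play_media", m.group(1))
    return Intent("fallback", utterance)


def execute(intent: Intent) -> str:
    # Dispatch to platform capabilities (Shortcuts, App Intents, etc.),
    # simulated here with plain strings.
    handlers = {
        "set_timer": lambda arg: f"Timer set for {arg}",
        "play_media": lambda arg: f"Now playing {arg}",
        "fallback": lambda arg: f"Sorry, I didn't understand: {arg}",
    }
    return handlers[intent.action](intent.argument)


def assistant(audio_path: str) -> str:
    return execute(interpret(transcribe(audio_path)))
```

The point of the sketch is the seams: each stage is swappable, so a polished version is mostly a matter of wiring better models into the same three slots.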
Apple remains the only big tech company not publicly diving head-first into the LLM race, but it’s relatively well-known that they’re developing one internally. Since that was reported last July, right around WWDC 2023, it was obviously not ready for an announcement then. Besides, that keynote was all about the Vision Pro anyway.
So, that point, coupled with the fact that Apple tends to be last on the trend train but offers a more polished experience, would make WWDC 2024 the perfect time to reveal an “all new” or revamped Siri that could run locally on, say, newer iOS devices or Macs with Apple Silicon. Local models are already practical: I use SuperWhisper on my M2 MacBook Air and it works amazingly well. It would also be a good carrot to get people to upgrade their phones, iPads, and Macs. Besides, what else would be important at WWDC next year? I don’t see much else on the horizon, other than the real launch of the Vision Pro.
Anyhoo, my random unsubstantiated thoughts. Talk among yourselves.