![WWDC 2025 Interview: Federighi and Joswiak on Siri Delay, Apple Intelligence, and iPadOS 26 [Video]](/images/news/97579/465700/465700-64.png)
WWDC 2025 Interview: Federighi and Joswiak on Siri Delay, Apple Intelligence, and iPadOS 26 [Video]
Posted June 11, 2025 at 3:11am by iClarified
Apple's Craig Federighi and Greg "Joz" Joswiak recently sat down for an in-depth interview with Mark Spoonauer of Tom's Guide and Lance Ulanoff of TechRadar, offering a candid look at the company's latest software announcements from WWDC 2025. The discussion surfaced new details on the delay of advanced Siri features, the philosophy behind Apple Intelligence, and the major overhaul of iPadOS.
One of the most significant topics was the delay of advanced Siri features that leverage on-device personal context, which were initially expected this year. Federighi provided a detailed explanation of the development process, revealing that Apple was essentially working on two different versions of the underlying technology. "We had... really two phases to... two versions of the ultimate architecture that we were going to create," he said. The first version, which he called the "V1 architecture," was the one Apple initially planned to ship. "We had at the time high confidence that we could deliver it. We thought by December and if not, we figured by spring," Federighi admitted.
However, as development continued, the team realized the V1 approach had significant limitations. "We found that the limitations of the V1 architecture weren't getting us to the quality level that we knew our customers needed and expected," he stated. This led to a crucial decision: "It wouldn't meet our customer expectations or Apple's standards. And that we had to move to the V2 architecture." The shift required more time, prompting the public delay. When asked about a 2026 timeline, he confirmed it was accurate, though he noted they "don't want to pre-communicate a date" until the feature is ready. He clarified that V2 was not a complete restart: "The V1 architecture was sort of half of the V2 architecture and now we extended it across... to make it a pure end-to-end architecture."
Both executives reiterated that Apple's approach to artificial intelligence is not about creating a single "destination" app or chatbot. "This wasn't about us building a chatbot," Federighi stated. "We weren't defining what Apple Intelligence was to be our chatbot." Joswiak expanded on this: "Apple Intelligence isn't a destination for us... There's no app Apple Intelligence. There's Apple Intelligence making all the things you do every day better." Federighi pointed to the numerous AI features that did ship, including Writing Tools, summarization, and photo cleanup, as examples of this broad, platform-wide approach. Part of that strategy is opening up Apple's on-device foundation models to third-party developers for the first time, letting them build intelligent features directly into their apps.
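For readers curious what that developer access looks like, Apple's new Foundation Models framework wraps the on-device model in a small Swift API. The sketch below is illustrative rather than drawn from the interview; it assumes the iOS 26 / macOS 26 SDK on Apple Intelligence-capable hardware, and the error type and prompt wording are hypothetical:

```swift
import FoundationModels

/// Hypothetical error for this sketch; not part of the framework.
enum DemoError: Error {
    case modelUnavailable
}

/// Minimal sketch of prompting Apple's on-device foundation model.
func summarize(_ note: String) async throws -> String {
    // Confirm the system model can run on this device before prompting it.
    guard case .available = SystemLanguageModel.default.availability else {
        throw DemoError.modelUnavailable
    }

    // A session keeps context across turns; instructions steer its behavior.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: note)
    return response.content
}
```

Because the model runs entirely on device, a call like this would involve no server round trip, which is consistent with the privacy framing the executives gave Apple Intelligence throughout the interview.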
The discussion also touched on the enthusiastic crowd reaction to the revamped iPadOS 26. Federighi explained that the new multitasking system, which includes a more advanced windowing system, tiling, Exposé, and a menu bar, was a natural evolution driven by more powerful hardware and changing user needs. "This is what you've all been waiting for: a new windowing system on iPad," he said. The goal, he explained, was to make the iPad more capable and "Mac-like" for productivity without losing its touch-first identity. He reached for the classic "cars and trucks" analogy to explain the Mac's and iPad's different but overlapping roles. "The Mac is, you know, it's got the trailer hitch on it. It's got the flat bed," he said, distinguishing the Mac's utility focus from the iPad's versatility.
The visual overhaul extends across all of Apple's platforms. Federighi detailed the company's new universal design language, Liquid Glass, which was inspired by visionOS. He described its unique properties, explaining that it "at once allows the interface to feel edge-to-edge... But at the same time, glass provides these objects that actually make the buttonness... feel almost more clear and salient." He also highlighted its adaptive nature, saying, "...we were able to build adaptive glass that changes the way it's transmitting color, that even can flip from a dark glass to a light glass..." The conversation also covered updates to other platforms, including dramatically improved Personas for Vision Pro and a host of new features that bring this unified design to every corner of the Apple ecosystem.
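On the developer side, SwiftUI picks up Liquid Glass through a dedicated modifier. The following is a rough sketch, assuming the iOS 26 SDK's glassEffect modifier; the button, label, and interactivity choices are hypothetical and not something demonstrated in the interview:

```swift
import SwiftUI

/// Rough sketch of a control adopting the Liquid Glass material.
struct ShareButton: View {
    var body: some View {
        Button {
            // Trigger the share action here.
        } label: {
            Label("Share", systemImage: "square.and.arrow.up")
                .padding()
        }
        // .regular is the standard glass; .interactive() lets the material
        // respond to touch, echoing the adaptive behavior Federighi describes.
        .glassEffect(.regular.interactive())
    }
}
```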
You can watch the full interview with Craig Federighi and Greg Joswiak below...